November 04, 2005
The Visualization Center at San Diego State University (SDSU), which relies on visualization technology from Silicon Graphics Inc. (SGI) to create and disseminate 3D geospatial datasets for projects including natural disaster mitigation and response, has added the processing and serving of NASA's "Blue Marble: Next Generation" satellite imagery of Earth to its list of achievements.
Beginning with high-resolution satellite imagery of Banda Aceh, Indonesia, acquired before and after last year's tsunami, and continuing through this year's U.S. hurricane season with before-and-after imagery -- especially of the devastation of Hurricane Katrina in the Gulf states -- SGI compute power and speed have helped SDSU deliver 3D geospatial visualization to relief workers and government officials. The Silicon Graphics Prism visualization system is an integral part of the geospatial image processing pipeline for the many efforts at the Visualization Center at SDSU, including homeland security, remote sensing and environmental monitoring, global sharing of information, and collaborative visualization.
"Blue Marble: Next Generation" uses imagery from NASA's 18 Earth-observing satellites, downlinked to the NASA Earth Observatory at NASA's Goddard Space Flight Center. SDSU both processes and serves the images using a Silicon Graphics Prism system. "Blue Marble" offers a year's worth of monthly composites at a spatial resolution of 500 meters per pixel. These monthly images reveal seasonal changes to the land surface: the green-up and dying-back of vegetation in temperate regions such as North America and Europe, dry and wet seasons in the tropics, and advancing and retreating Northern Hemisphere snow cover, helping scientists in many disciplines make more detailed observations of our world.
According to NASA's web site, commenting on the upgrade from the original "Blue Marble," "From a computer processing standpoint, the major improvement [in 'Blue Marble: Next Generation'] is the development of a new technique for allowing the computer to automatically recognize and remove cloud-contaminated or otherwise bad data -- a process that was previously done manually."
Additional processing is accomplished using the GeoMatrix Toolkit from GeoFusion, Inc. (http://www.geofusion.com), which is the backbone of the Visualization Center's high-performance imaging and GIS environment. The scalable computing power and large memory of the Silicon Graphics Prism system allow researchers to use GeoMatrix tools to process data for serving in the GeoPlayer ActiveX web browser plugin.
By visualizing hundreds of gigabytes to many terabytes of geospatial data, the researchers at the Visualization Center at SDSU are able to continuously create up-to-date 3D fly-throughs that depict the changes wrought by a natural disaster. The Silicon Graphics Prism system is the heart of the process and is used to create the new datasets, routinely processing 500 or more image files of up to 200 MB each night to create mosaics of a terabyte or more. The Silicon Graphics Prism system at SDSU has 24 GB of RAM and eight Intel Itanium 2 processors, and runs GeoMatrix and OSSIM open-source tools in a Linux environment. This configuration allows conversion of all data into easily accessible, open formats; the data is then stored back out to the servers at SDSC for public access over the Internet. In the case of Hurricane Katrina, WMS (Web Map Service) data were served directly from the Silicon Graphics Prism system.
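The nightly mosaicking step can be sketched roughly as follows -- a minimal, hypothetical illustration in plain Python (the Center's actual pipeline used GeoMatrix and OSSIM on the Prism), placing georeferenced tiles into one output grid:

```python
# Toy mosaic builder: each tile carries its upper-left geographic corner
# and a 2-D grid of pixel values; all tiles share one resolution.
# This is an illustrative sketch, not the production pipeline.

def mosaic(tiles, res):
    """tiles: list of (west, north, rows), where rows is a 2-D list of
    pixels at `res` degrees per pixel. Returns (grid, (west, north))."""
    # Bounding box covering every tile.
    west = min(t[0] for t in tiles)
    north = max(t[1] for t in tiles)
    east = max(t[0] + len(t[2][0]) * res for t in tiles)
    south = min(t[1] - len(t[2]) * res for t in tiles)

    width = round((east - west) / res)
    height = round((north - south) / res)
    out = [[0] * width for _ in range(height)]  # 0 = nodata fill

    # Paste each tile at its offset within the mosaic grid.
    for twest, tnorth, rows in tiles:
        col0 = round((twest - west) / res)
        row0 = round((north - tnorth) / res)
        for r, row in enumerate(rows):
            for c, px in enumerate(row):
                out[row0 + r][col0 + c] = px
    return out, (west, north)
```

In the real workflow each "tile" is a geo-rectified aerial or satellite image hundreds of megabytes in size, and the shared-memory Prism lets hundreds of them be composited in one pass each night.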
To create the mosaics and 3D fly-throughs that Red Cross and other relief workers would use to determine whether Katrina's victims had a house to return to, SDSU acquired data from a number of sources. Most of the imagery datasets of the affected Gulf states were taken by NOAA, whose specially equipped airplane rapidly acquired high-resolution imagery over the damaged area. Similar high-resolution photography, especially the "before" imagery, was acquired from the USGS EROS Data Center and from other groups such as the Army Corps of Engineers. NASA satellites also provided before and after imagery, which is well suited to a regional, multi-spectral perspective that can be combined with the high-resolution photography to provide location and context.
The DigitalGlobe satellite supplied 60-centimeter imagery from before and after Katrina, offering one of the first compelling views of the storm's extraordinary impact in both flooding and destruction. The high-resolution color photography was reacquired repeatedly as the water drained, providing insight into how the flooding changed as levees broke and water advanced and receded.
"The Silicon Graphics Prism performed incredibly well," said Eric Frost, co-director of the Visualization Center at SDSU. "There were 5,000 aerial shots in the first batch of after-Katrina photos, and each photo was between 150 and 200 MB. Then 2,500 more would come in the next day, and the next, and so on. An inherent aspect of the photos, especially with low-altitude photography, is that the scale is different from the center of the photo to the outside edge because the airplane is much closer to the center of the shot than it is to the outside."

"It's like taking a picture of someone when you're too close, where the nose looks bigger than the rest of the face. In order to put all those photos together into a mosaic and to add GIS data onto them, you actually have to process them in a way where you know exactly what the errors are and you can move all the pixels to the right place," Frost continued. "So just to begin work on the first 5,000 photos, the team of experts that came together around the Silicon Graphics Prism system color-balanced them and then geo-rectified them -- meaning that you're putting all the pixels where they actually are on the Earth. That normally would be weeks or months of processing, but there was a very special coalition of extremely talented image processors and computer scientists that worked together from a number of different institutions."
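The geo-rectification Frost describes can be illustrated with a toy example: fit a pixel-to-ground transform from a few ground-control points (GCPs) that tie image pixels to known Earth coordinates, then use it to move every pixel to its true location. This simplified sketch fits an affine transform from exactly three GCPs; the actual processing corrected far more complex perspective and terrain distortion.

```python
# Illustrative geo-rectification sketch (not the Center's actual code):
# solve x_geo = a*px + b*py + c and y_geo = d*px + e*py + f from 3 GCPs.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve_affine(gcps):
    """gcps: three ((px, py), (x_geo, y_geo)) pairs.
    Returns (a, b, c, d, e, f), solved by Cramer's rule."""
    A = [[p[0], p[1], 1.0] for p, _ in gcps]
    denom = det3(A)
    coeffs = []
    for axis in (0, 1):            # axis 0 -> x_geo, axis 1 -> y_geo
        rhs = [g[axis] for _, g in gcps]
        for col in range(3):
            m = [row[:] for row in A]
            for r in range(3):
                m[r][col] = rhs[r]
            coeffs.append(det3(m) / denom)
    return coeffs

def pixel_to_ground(a, b, c, d, e, f, px, py):
    """Map an image pixel to ground coordinates with the fitted transform."""
    return (a * px + b * py + c, d * px + e * py + f)
```

With the transform in hand, resampling every pixel of a 200 MB frame into the mosaic grid is exactly the kind of embarrassingly parallel work the Prism's processors churned through each night.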
Once the photos are geo-rectified, data on locations of roads, city boundaries, hospitals, schools, police stations, and fire stations are added. As more and more information comes in, other data sets are laid on: where the damage is; what assets belong to HUD; where the refineries are; where the gas stations are; and where hazardous waste materials are.
The U.S. Census Bureau's TIGER data, linked to the imagery by Howard Butler of Iowa State University, enabled the Katrina relief team to add something called "geo-coding." Geo-coding means that anyone working for the Red Cross at any shelter in the U.S. could type in an evacuee's address and the computer would immediately fly through the 3D geospatial dataset or a flat map to where that address is -- or was -- on that particular city block or country road.
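TIGER-style geo-coding can be sketched as follows. The segment data below is invented for illustration, but the core idea -- linearly interpolating a house number along a street segment's address range -- matches how TIGER address ranges are typically used:

```python
# Hypothetical street-segment table in the spirit of TIGER address ranges:
# (street, from_addr, to_addr, (lon0, lat0), (lon1, lat1)) -- values made up.
SEGMENTS = [
    ("MAIN ST", 100, 198, (-89.10, 30.40), (-89.09, 30.40)),
    ("MAIN ST", 200, 298, (-89.09, 30.40), (-89.08, 30.41)),
]

def geocode(street, number):
    """Return an approximate (lon, lat) for a street address, or None."""
    for name, lo, hi, (x0, y0), (x1, y1) in SEGMENTS:
        if name == street and lo <= number <= hi:
            t = (number - lo) / (hi - lo)   # fractional position on the block
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
    return None
```

The returned coordinate is then handed to the viewer, which flies the 3D camera to that point on the imagery.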
The U.S. Navy also came to the team for geo-coding. The Navy wanted to know how many of its personnel, contractors, and their dependents had been affected by the disaster. Chuck Stein took the address dataset as a database file, converted the addresses into 3D icons using the GeoMatrix format, and ran the data through GeoFusion and out of the Silicon Graphics Prism system to a monitor. The result: little yellow flags came up at the addresses of all 8,000 people, and it was obvious that several thousand had lost their homes or had likely been evacuated because of the damage to, or location of, their homes. Within one day, the Navy was able to narrow the search to about a hundred people, and the next day it found almost all of them.
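A batch pass like the Navy roster lookup might look like the sketch below; the column names, file layout, and lookup function are all illustrative assumptions, not the actual Navy data or the GeoMatrix icon format:

```python
# Hypothetical roster pass: read an address file and emit one "flag"
# record per address that geocodes successfully.
import csv
import io

def flag_addresses(csv_text, lookup):
    """lookup(street, number) -> (lon, lat) or None if not found."""
    flags = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        pos = lookup(row["street"], int(row["number"]))
        if pos is not None:
            # Each record would become a 3D icon at these coordinates.
            flags.append({"name": row["name"], "lon": pos[0], "lat": pos[1]})
    return flags
```

In the real workflow, each flag record became a yellow 3D icon rendered over the post-storm imagery, making damaged addresses visible at a glance.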
Once the data is uploaded to SDSU's Web site, researchers, government agencies, and the public can use it. One of the most powerful uses of the data was overlaying toxicity and medical data onto the images, helping interpret the ongoing stream of environmental monitoring data being collected by the many groups assisting with decision-making for the future of the region and its people.
Duke University professor Marie Lynn Miranda, a specialist in children's environmental health, considers lead contamination levels one of the biggest issues in the recovery. Professor Miranda is leading the National Institute of Environmental Health Sciences (NIEHS) effort, and the Visualization Center at SDSU is providing the computational power, on the Silicon Graphics Prism system, for the work she and others are doing.
Researchers at UCSD who work with NIEHS on Superfund site efforts, including professors Bob Tukey and Mark Ellisman, connected SDSU and its imaging capabilities with Professor Miranda, whose GIS expertise and strategic position help decision-makers understand what the problems are and what possible solutions might be. Researchers like Professors Miranda, Tukey, and Ellisman are using the completed data sets, adding census data and ongoing environmental measurements, to identify areas where people could move back -- and sections where people, especially children, who are more likely to develop severe physical problems from exposure to high lead levels, should never return. Serving terabytes of imagery and GIS data to researchers and field workers, both in the affected area and in decision centers focused on helping the people of all the affected states, should provide a significant service to the nation for many years to come as the long-term medical impacts of Katrina and other hurricanes come to light.
"To really do the right thing for the people ravaged by Katrina, it takes a massive amount of data fusion -- compositing the toxicity and damage data with imagery and then tracking the changes through time," said John Graham, the Visualization Center's senior research scientist, who helped lead the processing effort and build the social network of specialists who made the Silicon Graphics Prism perform so remarkably. "When you're working with satellite and aerial photography, you can be dealing with multiple terabytes of data. This is where the Silicon Graphics Prism system really shows off the power of its shared-memory architecture, with its ability to take all the bricks and connect them to appear as one large computer with lots of memory. Then taking the geo-referenced imagery and 'cooking' it into GeoFusion OpenGL texture format and storing it on high-speed servers allows anybody with a Windows PC that has an OpenGL video card and Internet Explorer -- which almost all new machines do -- to use the ActiveX web browser plugin and fly through those terabytes of data. But it's SGI technology, processing the data on the backend, that is making this all possible."
While gratified by the huge amount of work it took fellow scientists, researchers, and numerous volunteers -- including GeoFusion, which wrote code to help everything move even faster -- to deliver these data sets to help Katrina victims, Frost envisions much faster access to the original data. He sees this coming both through networks like the National LambdaRail, a dark-fiber grid used by universities, and through 10 Gigabit Ethernet connections, especially at government facilities like the EROS Data Center and NASA's Goddard Space Flight Center, as well as at Washington, D.C.-area groups such as the Naval Research Laboratory and the OSSIM researchers who were involved in the effort.
SDSU's first source of Katrina data was the U.S. Geological Survey's Earth Resources Observation and Science (EROS) Data Center in Sioux Falls, S.D. When Graham first accessed EROS to download one data set via FTP, the screen read, "Estimated time: 218 hours." All the needed datasets were eventually written to disk and FedExed or flown to SDSU.
"It is clear that scientific visualization has reached a new and powerful level to model, predict and plan for natural disasters," said SGI's Bishop. "The Visualization Center at San Diego State University offers a clear demonstration of how this technology can be used to triumph over adversity."