June 06, 2013
Visualizing 3D and 4D environmental data is necessary for greater understanding and prediction of environmental events.
Researchers from the Center for Intelligent Spatial Computing and the University of Denver are exploring how CPUs and GPUs can be harnessed together to accelerate a sample geovisualization workflow, using dust storms as the subject.
By visualizing these storms, the team was able to develop a 3D/4D geovisualization framework that spans preprocessing, reprojection, interpolation, and rendering.
While the CPU remained an important component of the project, particularly for preprocessing data too large to fit in the GPU's on-board memory, the GPU delivered a faster and more efficient solution for the remaining stages.
They were then able to compare performance between GPUs and CPUs. Their findings showed that multicore CPUs and manycore GPUs can both improve the efficiency of calculation and rendering through multithreading. They also found that, for a fixed amount of data, increasing the GPU block size for the coordinate transformation steadily reduces the execution time of interpolation and rendering until performance peaks, after which larger blocks yield no further gains.
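The block-size effect described above can be illustrated with a minimal CUDA sketch. This is not the team's actual code: the kernel, the equirectangular projection constants, and all names here are illustrative assumptions. The same coordinate-transformation kernel is launched with progressively larger thread blocks, and each configuration is timed with CUDA events.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// Illustrative stand-in for the reprojection step: convert lon/lat
// degrees to planar metres with a simple equirectangular projection.
__global__ void reproject(const float *lon, const float *lat,
                          float *x, float *y, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        x[i] = lon[i] * 111320.0f;  // approx. metres per degree longitude at the equator
        y[i] = lat[i] * 110540.0f;  // approx. metres per degree latitude
    }
}

int main()
{
    const int n = 1 << 22;  // ~4M grid points (assumed dataset size)
    float *lon, *lat, *x, *y;
    cudaMallocManaged(&lon, n * sizeof(float));
    cudaMallocManaged(&lat, n * sizeof(float));
    cudaMallocManaged(&x,   n * sizeof(float));
    cudaMallocManaged(&y,   n * sizeof(float));

    // Sweep block sizes: per the article, execution time falls as the
    // block grows until a peak configuration is reached.
    for (int block = 32; block <= 1024; block *= 2) {
        int grid = (n + block - 1) / block;

        cudaEvent_t t0, t1;
        cudaEventCreate(&t0); cudaEventCreate(&t1);
        cudaEventRecord(t0);
        reproject<<<grid, block>>>(lon, lat, x, y, n);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, t0, t1);
        printf("block=%4d  %.3f ms\n", block, ms);
        cudaEventDestroy(t0); cudaEventDestroy(t1);
    }

    cudaFree(lon); cudaFree(lat); cudaFree(x); cudaFree(y);
    return 0;
}
```

Sweeping the launch configuration like this is how the optimal block size is typically found in practice, since the peak depends on the kernel's register and memory usage as well as the hardware generation.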
The team also concluded that the best results from the GPU implementations of all three major processes are usually faster than their CPU-based counterparts, although for the rendering component the best GPU and CPU performance is similar.
On the memory front, they note that the GPU's on-board memory limits its capacity for processing large data volumes, which is why preprocessing had to be done on the CPU. Even so, the project's overall efficiency was constrained by the relatively high latency of data transfers between the CPU and the GPU.
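A common way to mitigate both constraints (datasets larger than device memory, and CPU-GPU transfer latency) is to stream the data through the GPU in fixed-size chunks while overlapping copies with compute. The sketch below is an assumption about how such a pipeline could look, not the team's implementation; the chunk sizes, stream count, and placeholder kernel are all illustrative.

```cuda
#include <cuda_runtime.h>

// Placeholder for per-chunk work such as interpolation.
__global__ void process(float *d, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) d[i] *= 2.0f;
}

int main()
{
    const size_t total = 1 << 26;   // full dataset: too large to keep resident
    const size_t chunk = 1 << 22;   // elements processed per chunk
    const int nStreams = 4;

    float *h;  // pinned host memory is required for truly asynchronous copies
    cudaMallocHost(&h, total * sizeof(float));

    float *d[nStreams];
    cudaStream_t s[nStreams];
    for (int i = 0; i < nStreams; ++i) {
        cudaMalloc(&d[i], chunk * sizeof(float));
        cudaStreamCreate(&s[i]);
    }

    // Round-robin chunks over streams so the copy-in, compute, and
    // copy-out of different chunks overlap, hiding part of the
    // host-device transfer latency the article describes.
    for (size_t off = 0, c = 0; off < total; off += chunk, ++c) {
        int i = (int)(c % nStreams);
        size_t n = (total - off < chunk) ? total - off : chunk;
        cudaMemcpyAsync(d[i], h + off, n * sizeof(float),
                        cudaMemcpyHostToDevice, s[i]);
        process<<<(int)((n + 255) / 256), 256, 0, s[i]>>>(d[i], (int)n);
        cudaMemcpyAsync(h + off, d[i], n * sizeof(float),
                        cudaMemcpyDeviceToHost, s[i]);
    }
    cudaDeviceSynchronize();

    for (int i = 0; i < nStreams; ++i) {
        cudaFree(d[i]);
        cudaStreamDestroy(s[i]);
    }
    cudaFreeHost(h);
    return 0;
}
```

The design trade-off is extra host-side bookkeeping in exchange for keeping the PCIe bus and the GPU's compute units busy at the same time, rather than serializing transfer and computation.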