May 05, 2011
One of the most fertile research areas to dig for solid cloud computing use cases lies in the life sciences.
This week an article in Scientific American looked at the ways scientists in this field are making use of the cloud. The author points to a number of case studies highlighting the value of Amazon’s cloud in particular, but also discusses the current limitations of cloud computing resources for biosciences researchers.
The author spoke with Giles Day, the managing director of cloud computing at San Francisco-based Distributed Bio, which provides informatics consulting services to a range of life sciences companies. In his view, clouds are not always an appropriate choice for clients. As he told Larry Greenemeier:
“Let’s say you’re producing terabytes of data that takes a relatively short amount of time to compute…In that case, you’re going to spend an awful lot of money and time shifting data into the cloud to gain a very small reward on the actual compute time.”
This particular problem is one that many outside of the life sciences are running up against as well, especially in research computing, where workloads often involve similar data sizes. To work around this issue, Day suggests that a hybrid model tends to work best: researchers keep the most data-intensive work in-house, while more manageable datasets are shipped off to the cloud, freeing up vital physical resources.
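Day's point can be made concrete with a quick back-of-envelope calculation comparing transfer time against compute time. The figures below (a 5 TB dataset, a 100 Mbps uplink, two hours of cloud compute) are hypothetical assumptions chosen for illustration, not numbers from the article:

```python
# Back-of-envelope comparison of data-transfer time vs. compute time
# when deciding whether to offload a job to the cloud. All figures
# here are hypothetical assumptions, not from the article.

def transfer_hours(data_tb: float, bandwidth_mbps: float) -> float:
    """Hours needed to move `data_tb` terabytes over a `bandwidth_mbps` link."""
    data_bits = data_tb * 1e12 * 8              # terabytes -> bits
    seconds = data_bits / (bandwidth_mbps * 1e6)  # bits / (bits per second)
    return seconds / 3600

# The scenario Day describes: terabytes of data, relatively little compute.
upload_h = transfer_hours(data_tb=5, bandwidth_mbps=100)  # ~111 hours
compute_h = 2.0                                           # hours of cloud compute

print(f"upload: {upload_h:.0f} h, compute: {compute_h:.0f} h")
```

With these assumptions the upload takes roughly 111 hours to gain two hours of compute, which is exactly the "small reward" Day warns about; shrink the dataset or grow the compute and the balance flips.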
As Day stated, “The perfect scenario for using the cloud in biotech is to outsource small amounts of data into the cloud that require a massively parallel computing system for processing and then have the results of that processing returned.”
As for those massive bandwidth bottlenecks, Day reminds researchers that despite modern technology, we still can’t break the laws of physics.
Full story at Scientific American