September 27, 2013
Sept. 26 — The Extreme Science and Engineering Discovery Environment (XSEDE) and the National Science Foundation Division of Advanced Cyberinfrastructure today released the XSEDE Cloud Survey Report, which presents the results of a survey identifying cloud computing use cases in research and education.
Data were collected from eighty projects around the globe, representing a cross-section of cloud users from twenty-one science and engineering disciplines and the humanities, arts, and social sciences. The survey explored quantitative dimensions of cloud usage (number of cores used at peak/steady state, bandwidth in/out of the cloud, amount of data stored in the cloud, etc.) and qualitative experiences (the benefits and challenges of using the cloud).
The report is available at http://hdl.handle.net/2142/45766.
· The top three reasons survey participants used the cloud were: (1) on-demand access to burst resources, (2) compute and data analysis support for high throughput scientific workflows, and (3) enhanced collaboration through the rapid deployment of research team web sites and the sharing of data.
· MapReduce was the most heavily used special feature offered by the cloud service providers, followed by access to community datasets.
· Application and programming models considered good candidates for the cloud were high throughput, embarrassingly parallel workloads; academic labs and teaching tools; domain-specific computing environments; commonly requested software; science gateways; and real-time event-driven science.
· Cloud benefits identified by the survey participants were pay as you go, lower costs, compute elasticity, data elasticity, Software as a Service, Education as a Service, broader use, scientific workflows, rapid prototyping, and data analysis.
· Cloud challenges identified included the learning curve, virtual machine performance, variability in bandwidth, memory limits, database instability, private/public cloud interoperability, security, data movement, storage, cloud computing costs, and funding availability.
· A more comprehensive and balanced cyberinfrastructure, i.e., a multi-level CI, is needed to support the entire spectrum of NSF-funded communities.
· Unlike traditional HPC workloads, many of the research and education applications surveyed required many cores rather than the fastest per-core performance.
· The challenges of using the cloud require continued investments in basic, applied, and experimental research.
· Investments that facilitate access to production cloud resources, cloud training, and cloud user consulting are needed as well, whether clouds are public, private, or national CI, or, more likely, some combination thereof.
· Although in their infancy, hybrid clouds hold the promise of enabling modest-size private clouds used for steady-state workloads to burst to public, community, or national CI during peak workloads. Most private clouds are expected to become hybrid clouds in the future. The challenge will be implementing a management framework that spans these environments.
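The MapReduce feature cited above as the most heavily used cloud offering follows a simple pattern: a map phase emits key-value pairs, and a reduce phase aggregates values by key. The sketch below is a generic single-process illustration of that pattern (a word count), not code from the report or from any particular cloud provider:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every input document.
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Reduce: sum the counts associated with each distinct key.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["cloud computing", "cloud bursting", "Computing at scale"]
print(reduce_phase(map_phase(docs)))
# → {'cloud': 2, 'computing': 2, 'bursting': 1, 'at': 1, 'scale': 1}
```

In a real cloud deployment, frameworks such as Hadoop distribute the map and reduce phases across many nodes, which is what makes the pattern attractive for the high-throughput, embarrassingly parallel workloads the survey identified.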
"This survey data will help the XSEDE management team develop a strategy for integrating cloud capabilities into national cyberinfrastructure," said John Towns, XSEDE PI and project director. "It will also help university administrators and computing directors envision what roles clouds might play at the campus level," he added.
The report authors are David Lifka (Cornell University PI), Ian Foster (Argonne National Laboratory/University of Chicago), Susan Mehringer (Cornell), Manish Parashar (Rutgers University), Paul Redfern (Cornell), Craig Stewart (Indiana University), and Steve Tuecke (Argonne National Laboratory/University of Chicago).
XSEDE is a virtual organization that provides a dynamic distributed infrastructure, support services, and technical expertise that enable researchers, engineers, and scholars to address the most important and challenging problems facing the nation and world. XSEDE supports a growing collection of advanced computing, high-end visualization, data analysis, and other resources and services. XSEDE is funded by the National Science Foundation.