May 31, 2010
IDC presented its market forecast this morning at ISC, which yielded a few surprises, a few non-surprises, and of course, an amazing amount of data. Since the HPC group tends to focus on large-scale movements and trends in HPC specifically, the topic of cloud computing did emerge, although it was generally only touched upon. The presentation did make note of successful HPC and cloud projects, including CERN’s drive to develop the world’s largest private cloud for the distribution of data, applications and resources for scientists, and also mentioned NASA’s Nebula, the NSF and Microsoft partnership, and NERSC’s collaboration with the Joint Genome Institute. It will not come as a surprise to see far more news on the cloud front from IDC in upcoming years since this trend is becoming more difficult to overlook.
While there will be a more thorough overview based on the presentation given this morning after we’ve had time to more closely analyze the results, there are some points that seemed worth repeating about the future of HPC—well, at least the future as it will arrive sometime between tomorrow and 2014.
The following are some general bullets gleaned during the talk this morning:
• Computational modeling, simulation and design have been established as the third branch of scientific inquiry, and HPC is a critical element in these capabilities. Although IDC did not discuss this element, these are areas where we are seeing some movement cloud-ward, thus the alignment here is very noteworthy.
• Along with the point above, in terms of overall HPC server revenue, Biosciences are at the top of the purchasing list, followed closely by CAE, Chemical Engineering, DCC and Distribution, and Financial Services. Again, pushing general processing out into the public cloud or handling it internally on a private cloud has been a hot issue in the life sciences as of late. For more on that, check out the several posts on the topic by Bruce Maches.
• In addition to the point above, HPC-driven innovation is a prerequisite for scientific and industrial leadership, economic advancement and national development goals. HPC can literally change a nation’s wealth.
• Petascale early science projects will tackle national and global problems; we have already entered the petascale era and are heading toward exascale computing.
• IDC also forecasts significant growth in the HPC interconnects market, anticipating growth from its current level of $2 billion to $2.5 billion. That almost doesn’t sound like much growth when you casually glance at those numbers, but those are billions we’re talking about here.
• X86 processors will continue to dominate, but GPGPUs will gain traction as x86 hits the wall.
• Supercomputers have become a major part of the market again. Although the petascale era is just dawning, governments are already exploring exascale.
• Some general roadblocks for HPC? “Power and cooling infrastructure limitations are the biggest barrier to increasing HPC resources” and also, more generally, “software has become the #1 roadblock for HPC clusters” as they are hard to set up and operate, hard to keep current from year to year, lacking in parallel software, etc.
• With the costs of “live” science and “live” engineering on the increase, HPC will need to continue to innovate in the cloud space, and applications will simply need to be rewritten. We are on the verge of a new paradigm in HPC.
More details about the IDC briefing to come later as the materials are released and reviewed. Even though the majority of the discussion centered around traditional HPC, the amount of cloud discussion was surprising. It will be of great interest to “compare notes” between this year and next to see how much more (if at all) the conversation will be dominated by cloud discussion and projections for HPC.
Posted by Nicole Hemsoth - May 31, 2010 @ 3:25 AM, Pacific Daylight Time
Nicole Hemsoth is the managing editor of HPC in the Cloud and will discuss a range of overarching issues related to HPC-specific cloud topics in posts.