January 17, 2012
If 2011 was the year that cloud computing gained public acclaim as a revolutionary new way to use computing and storage resources more efficiently, 2012 may be the year that the IT industry's predominant focus on the public cloud begins to give way to a greater appreciation of hybrid cloud computing.
Public clouds – which allow multiple users to share massively scalable arrays of servers, storage and other resources through the Internet for maximum cost savings – dominate the current cloud computing market. Most enterprises, however, still rely largely on either internal or externally hosted private clouds that reserve computing resources exclusively for their own organization.
Hybrid clouds, on the other hand, combine aspects of both these models, providing the control and legacy compatibility of dedicated internal resources where needed, while shifting other less-demanding or less-critical applications to the off-site cloud. Hybrid clouds, as I view them, can encompass not just a blend of internal and external computing resources, but also both dedicated physical servers and virtual servers. This inherent flexibility can make hybrid solutions a good choice for adding new capacity or capabilities to existing legacy systems while still gaining some of the cost and scalability benefits of the public cloud.
Amazon Web Services, with revenues approaching $1 billion a year, has become far and away the industry's largest cloud services provider while concentrating almost exclusively on the public cloud. Yet Amazon, even when combined with a host of lesser-known public cloud providers, is barely scratching the surface of cloud computing’s ultimate market potential.
While the public cloud market is probably at least 10 times larger than the hybrid cloud market today, I expect the hybrid cloud to begin closing that gap. It wouldn't be surprising, in fact, to see hybrid implementations reach 50 percent or more of the total market as mainstream enterprise users grow increasingly serious about moving more of their applications and infrastructure to the cloud.
Enterprise cloud concerns
The truth is, most enterprise computer users aren't willing, or able, to entrust their mission-critical applications to the public cloud. Their reluctance is partly due to concerns about security, reliability and regulatory compliance in a shared computing environment. Equally important, however, is the realization that there are some things the public cloud simply won’t do.
The problem is that while some applications run just fine on virtual servers, others don’t. If you’re running multiple virtual machines (VMs) or sharing servers among multiple users, you run the risk of one VM consuming a disproportionate share of the computing, storage or bandwidth resources. And when the remaining users must vie for a small, fixed pool of resources, applications are prone to crash or perform poorly.
Unfortunately, users in a public cloud environment simply don’t know what demands other users are going to place on their shared computing resources. As a result, enterprises often don’t know whether their applications will even work in the public cloud.
My experience shows that some applications – particularly large enterprise resource planning databases, accounting systems and other I/O-intensive applications – require the full capabilities of a dedicated hardware system. That’s why offering both physical and virtual dedicated server hosting, in addition to a full range of public and private cloud services, makes sense. I have learned that when it comes to building and operating effective cloud-based systems, “one size fits all” is often not the best approach.
There are situations where it makes sense to keep at least some data and IT resources in-house. But there clearly are significant benefits to be gained from moving web servers, storage and other easily managed functions into the cloud. Just remember that whenever you’re ready to venture beyond your own data center and into the cloud, there are a wide variety of options available.
Start by identifying your specific needs and objectives, and then consider what makes sense to move into the cloud and what doesn’t. And unless you’re ready to make a major change, be sure to choose a solution that gives you the flexibility to run all the applications you already have.
About the Author
Denoid Tucker is Vice President of Technology for StrataScale, Inc., a Sacramento, Calif.-based provider of public and private cloud servers, dedicated servers and hybrid hosting services.