November 17, 2008
It’s funny how life works inside large software companies. For example, CA today announced a new push around virtualization management and cloud computing enablement, but this push wasn’t necessarily the product of a long-term strategy.
In fact, says Stephen Elliot, vice president of strategy for CA’s Infrastructure Management and Data Center Automation business unit, when the company decided to get aggressive around virtualization management, people were surprised by the capabilities that had been developed across the product portfolio. From backup/disaster recovery with XOsoft (and a related VMware partnership) to mainframe support with Mainframe VM Manager to security with AccessControl, CA’s individual divisions had built in virtualization capabilities that, taken as a whole, enable advanced management of the virtual layer. “We have some new products, but then we have additional capability where some of these key products have added, quite quietly, the ability to manage virtual machines and virtual infrastructure,” he explained to me.
What spurred CA’s movement to get its “arrows pointed in the same direction” is the transformational effect virtualization is having on customers, and their subsequent inquiries into how CA fits in with the individual virtualization vendors’ management platforms. Elliot says the answer is easy: “It’s really the difference between how do you manage the platform … and how does it really move from the business management of the enterprise.” As customers hit virtualization tipping points, he told me, they need features like deep performance visibility and detailed views of virtual and physical infrastructure.
CA believes its solution set offers end-to-end management of and visibility into enterprise infrastructures, with a prime example being the “great triumvirate” of CA, VMware and SAP. According to Elliot, “SAP is increasingly one of the top types of application workloads that customers want to virtualize,” and the business model whereby SAP is reselling CA Wily Introscope is facilitating this move. Customers don’t want six different vendor relationships just to manage the virtualization layer, Elliot told me, and CA (along with partner VMware) can help create a unified interface.
Of course, expanding your virtualization footprint isn’t all roses, so CA also helps customers adopt best practices, manage complexity and maximize ROI. The company plans to expand multiple-hypervisor support into more products, and Elliot says CA is carefully watching the inflection points where customers are choosing to utilize multiple virtualization platforms.
These services and future directions probably are a good thing, as Elliot doesn’t see any sign of a virtualization slow-down. “We haven’t talked to any customers who said, ‘Geez, I’m pulling back. I don’t want to put more on my virtual machine infrastructure,’” he told me. “It’s been all the opposite … where the question is now becoming ‘What shouldn’t we put on virtual machines?’”
One of the new products Elliot mentioned is Data Center Automation Manager (DCAM), which also happens to be the focal point of CA’s new cloud computing strategy. DCAM helps companies compress change management, automate provisioning and allocate resources on demand, but beyond that, CA’s cloud strategy isn’t too clear. To hear Elliot tell it, CA’s main goal right now seems to be getting feedback so it can produce the best possible cloud computing offering, whenever that might arrive. “The last two or three quarters … even some of the larger enterprises are asking us what are we going to do for the cloud, how do we plan to manage clouds, how do we plan to consider what are the key requirements for the so-called cloud.”
Some of this interest probably relates to the world’s current economic woes, Elliot added. “I think this particular cycle, what we’re seeing here, is IT really looking at the different types of ways they can get their functionality to the customers -- whether it’s a cloud or those types of services, whether it’s software as a service, or whether they’ve got their internal IT staff that is really thinking more like an internal cloud or service provider.”
Finally, Elliot noted, CA is very cognizant of the bottom line as it moves into the clouds. Aside from getting customer feedback, he says, “It’s gonna be just as important for us to recognize what we should be doing as it will be to put development dollars into what we will be doing.”
Posted by Derrick Harris - November 17, 2008 @ 1:24 PM, Pacific Standard Time
Derrick Harris is the Editor of On-Demand Enterprise