August 23, 2010
To read that one of the leaders at Autodesk Labs is not only broadly stating the company’s ambitions for Computer-Aided Design (CAD) and Computer-Aided Engineering (CAE) applications in the cloud, but calling this movement “a disruptive change,” signals that the traditional delivery model for CAD/CAE software might, like other segments of the software industry, be losing steam. While some can argue about the pace of such a march and what it implies for the industry over the next year (or even five years, for that matter), the fact remains that cloud momentum in an application segment many have declared non-cloud-compatible is compelling news, if nothing else.
The general consensus has been that CAD and CAE applications are poorly suited to the cloud for a number of reasons, not least that such applications tend to require high-performance networks and GPU capabilities beyond what the public cloud can accommodate. Still, a number of vendors, from startups to the undisputed leader in the space, Autodesk, are venturing into the cloud in search of new ways to deliver their software to designers and engineers.
As Beth Stackpole noted today in DesignNews, the cloud is viewed as incompatible with the goal of “delivering the performance and interactivity required for data-intensive, graphically demanding CAD and CAE applications, especially when it comes to handling complex assemblies and larger models.” Add to that a host of other perceived technical barriers, along with the security concerns that typically arise whenever losing “control” over processes comes into play.
Brian Matthews, Vice President of Autodesk Labs, told DesignNews that many “are focusing on the cloud to do the old method better, cheaper and faster, but the real implication is to do what couldn’t have been done with the traditional model.”
Matthews went on to note that Autodesk is stepping up its efforts to bring more of its products to a cloud-ready state, and that the beauty of the model is that “with the cloud, you can ask for 10, 100, or even 1,000 CPUs and rent them for minutes, seconds or hours, and you don’t have to buy a supercomputer. Not only do you get answers much more quickly, you end up asking more questions…so the machine can give you the optimal answer to your optimization rather than an acceptable one. This is a disruptive change.”
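The workflow Matthews describes, fanning a design-space sweep out across many rented cores and keeping the best result, can be sketched in a few lines. This is a toy illustration only: the design variable (a plate thickness), the objective function, and the solver stand-in below are all invented for the sake of the example, not anything from Autodesk's products.

```python
from multiprocessing import Pool

def evaluate_design(thickness_mm):
    """Toy stand-in for a CAE solve of one candidate design.

    In the elastic model Matthews describes, each evaluation could run
    on its own rented core instead of queuing on one workstation.
    Hypothetical objective: a weight penalty plus a stress penalty.
    """
    weight = 2.0 * thickness_mm
    stress = 100.0 / thickness_mm
    return weight + stress

if __name__ == "__main__":
    # Sweep thicknesses from 1.0 mm to 10.0 mm in 0.1 mm steps.
    candidates = [t / 10.0 for t in range(10, 101)]
    with Pool() as pool:  # fan the sweep out across local workers
        scores = pool.map(evaluate_design, candidates)
    best_score, best_thickness = min(zip(scores, candidates))
    print(f"optimal thickness: {best_thickness} mm (score {best_score:.2f})")
```

The point of the sketch is the shape of the computation, not the physics: because every candidate is independent, widening the sweep from 91 designs to 9,100 is just a matter of adding workers, which is exactly the "ask more questions" effect Matthews points to.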
Full story at DesignNews