HPC Matters is a joint blog consisting of contributors from the Tabor Communications team on their observations and insights into HPC matters.
August 14, 2008
If one were to categorize the enterprise software market as mature, robust, innovative -- and definitely 21st century -- as it races headlong into the cloud, how would one categorize the HPC software market? Not to throw stones, but you could easily put it in the circa-1980 timeframe and use terms like immature, cottage-like and definitely lacking investment. When you mention HPC to any of the venture guys, they run for the hills.
I can rewind 20-plus years to when I first entered this market, and the conversation has not changed. No software, no money for software, and a continuous hum over the impracticalities of building software for an increasingly complex set of platforms. The dialogue never seems to evolve beyond parallel programming, languages (Fortran and C, of course), open source and the vertical application specialists who seem to own the scarcest commodity.
Now mind you, there is an implicit expectation that the government should invest in and drive the initiative. Not only is the conversation focused in the wrong direction, but we are not even asking the right question! In my opinion, the issue is much more complex. Don't get me wrong: these are not absolutes, and we certainly need to get very real and focused on solving the challenges associated with programming models, as well as building robust middleware and tools. The big question, however, is around productivity, not platforms: how do we make high performance systems fit seamlessly into an overall IT environment? For me the question is, "Where is the SAP for the rest of us?"
If productivity is the "uber-trend," are we focusing on the right issues? A quick analysis of the numbers tells a very interesting story. The majority of the market is comprised of smaller clusters -- not extreme levels of parallelism. Yes, multicore will make things more complicated. But the heart of the market (the sweet spot) is in the midrange and in the industrial sector -- broadly defined. This is where the growth is, and this is where customers need help.
What is interesting to me is that after all of these years, we have not found the "common thread" that links all of these segments -- an HPC equivalent of ERP, if you will. Maybe we haven't looked! Workflows in product development are pretty similar. Supply chains are complex and growing more so, and again have common attributes. Data volumes, and the management and use of that data, are becoming enormously difficult. I can't begin to count the number of users who ask why we don't have some sort of application framework that enables applications to "speak to one another." I dug into my personal archives to find some anecdotal statements from discussions I have had recently.
The list goes on....
My apologies to those who have innovated on this front. PTC (Parametric Technology Corp.) and Accelrys immediately come to mind -- but these attempts are highly verticalized. If you are in the manufacturing segment, you are in luck. Between PTC and Dassault there are solutions. PTC has found the "killer app" in the manufacturing space -- PLM. But why hasn't that translated to other segments?
I've heard all the arguments, from "HPC applications are too niche" and "there is no demand" to the lack of common horizontal applications. But they all miss the point that there is a great deal of commonality within and across engineering and scientific workflows. Common requirements exist around data transparency, data analysis, and data management. There are also requirements for integrated applications that provide consistency and efficiency between elements of a workflow, across an organization or beyond the borders of a corporation.
Maybe it is a matter of timing. HPC is hot right now and getting hotter every day. From my vantage point, this screams of opportunity. Someone, please jump in!
Posted by Debra Goldfarb - August 13, 2008 @ 9:00 PM, Pacific Daylight Time