May 02, 2013
While Intel has yet to cast a clear spell with the interconnect magic it conjured from Cray and QLogic, the company seems to be getting closer to defining the role of these acquisitions in its future high performance computing and big data plans.
On the heels of some recent revelations about where the almighty interconnect snaps into the future of both big data and HPC, today's announcement of a new CEO, Brian Krzanich, adds further intrigue. New leadership might signal new attention to the core technologies that are powering big businesses, including Intel's own processes.
Krzanich has a tick over 30 years under his Intel belt, but unlike his predecessor, not all of it on the business leadership front. The new CEO, set to take the role in mid-May, is an on-the-ground technologist who has toiled in process manufacturing for a good deal of his lengthy career with the chipmaker, meaning he has dirtied his hands with the mess of moving data to solve large-scale engineering challenges.
While it's a bit too early to speculate on how his direct semiconductor manufacturing experience will play into the larger goals of the company, especially on the consumer side, it's heady news for the high performance computing market. Intel is now outfitted with a leader who understands first-hand the practical challenges of advanced computing initiatives, the cutting-edge technologies that will power the next wave, and what both mean during this time of increasingly spicy competition among key silicon vendors.
At the core of Intel's future in both the enterprise-driven big data market and its bread-and-butter HPC business is the one thread that ties it all together, at least in the opinion of some of its leaders on the high performance computing and big data sides: the interconnect. With a leader installed who knows firsthand where developments on this front can lead, and who can rally his troops behind the critical components that make fast chips unstoppable, Intel might be inclined to push its QLogic and Cray products more aggressively.
In other words, it's about competitiveness, not just for users, but for Intel's future role. This week at the IDC User Forum in Tucson, Intel's John Hengeveld (probably inadvertently) teed up this breaking news gently by talking about what big data, HPC and innovations on the interconnect front mean for their large-scale users.
Hengeveld says that whether it’s big data or HPC, the definitions will always be the subject of debate, but in the end, it’s the interconnect that will define both. In HPC, he says, it’s not just about processing—it’s connecting data to work against a large problem seamlessly. Big data is the same, but it’s not all about the analytics. At the core is moving data quickly and both problem areas are concentrated on removing the barriers to data.
Hengeveld explained how Intel drank its own big data Kool-Aid by culling massive volumes of data from its chip validation process. In an effort to optimize this lengthy part-validation cycle, the company used multi-sourced and multi-structured historical test data to devise a faster analytic process. By finding the needles in their own messy haystack, he claims, Intel was able to cut its time to market by 25%.
What refines these critical business processes isn't just raw data and some vague analytical process; it's the processing and data movement required to push it all forward.
To put this in context, he asked the audience to think of big data as an oil field. Under the ground lies the end result of massive research and digging efforts. But what really matters in the process of extraction, he argues, is the technology that powers the drill: keeping the process lean across the many technology layers that back the tools of extraction.
But with so many different drills for so many markets, what's the most efficient way to sharpen the bit?
Again, speculation aside, it's hard to argue with the potential value of a leadership structure that has wrestled with these data movement and analytics challenges in the manufacturing processes behind chip production. How that experience plays into the larger markets Intel fights in remains to be seen, but it's clear a new, more practical technology focus is set to be ushered in at Intel.
Underscoring that renewed emphasis on practical technology, Intel's other leadership announcement today was Renee James' move into the role of president, making her the highest-ranking woman in Intel's decades-long history.
An MBA and former technical assistant, then chief of staff guiding datacenter and chip manufacturing initiatives, James brings perspective from the company's security, software, services and other non-chip businesses, lending the new CEO insight on the larger business view.
It's hard not to place bets when new CEOs take over with focused expertise in once-segmented parts of a technology giant's business, but if I were a betting kind of gal… well, we'll save that for another time.