May 15, 2008
If you read just one HPCwire article this week, be sure to catch John West's profile of the National Visualization and Analytics Center (NVAC). The center is developing visual analytic tools for its sponsor, the U.S. Department of Homeland Security (DHS). The work is particularly interesting because it fits into the category of "Edge HPC," Tabor Research's term for HPC that lies outside the traditional science and engineering realm. Certainly, the DHS mission of protecting the nation from terrorist attacks is one of the edgier missions in the U.S. government.
According to NVAC Director Jim Thomas, "Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces." It combines visualization with human factors and with information, geospatial, and scientific analytics. An example is gathering multiple news streams from different media (television, radio, online) to uncover some trend, say, the global demand for rice, and create an interactive visual model of that trend. Since NVAC is sponsored by the DHS, the applications of interest run more toward border security, customs control, national security, disaster planning, crisis management, and the like. In any context, visual analytics promises to be one of the key technologies for dealing with the data deluge everyone is talking about.
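To make the idea concrete, here is a minimal sketch (in Python, and not anything NVAC has published) of the front end of such a pipeline: pull several news feeds, count how often a keyword like "rice" appears per day, and emit the daily trend data that an interactive visualization layer would then render. The feed URLs are hypothetical and the feedparser-based approach is an illustrative assumption.

```python
# Minimal sketch: keyword-trend extraction from several news feeds.
# The feed URLs below are placeholders, not real sources.
from collections import defaultdict
from datetime import datetime

import feedparser  # common RSS/Atom parsing library

FEEDS = [
    "https://example.com/world-news.rss",
    "https://example.com/markets.rss",
    "https://example.com/asia-pacific.rss",
]
KEYWORD = "rice"

mentions_per_day = defaultdict(int)
for url in FEEDS:
    feed = feedparser.parse(url)
    for entry in feed.entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        if KEYWORD in text:
            published = entry.get("published_parsed")
            if published:
                day = datetime(*published[:3]).date()
                mentions_per_day[day] += 1

# Hand the daily counts to a plotting layer; a real visual-analytics
# system would make this view interactive (zoom, filter, drill-down).
for day in sorted(mentions_per_day):
    print(day, mentions_per_day[day])
```

The point of the sketch is only the data-reduction step; the hard part NVAC works on is turning streams like this into interfaces an analyst can reason with.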
But the NVAC work represents only a tiny slice of the Edge HPC world. In a recent article in HPCwire, Tabor Research analysts offer a breakdown of this new category. It includes applications such as electronic trading, real-time supply chain management, multimedia data mining and searching, and online gaming. Yes, Edge HPC is a big tent.
The TR article states "[T]his 'Edge HPC' market is currently generating significant revenues and has strong growth potential. Over time, we expect it to exceed the [traditional HPC] market due to the scope of domains it will impact."
If that's true, how will today's HPC vendors tap into it? The theory is that traditional HPC technologies can be leveraged for Edge HPC, but doing so will take new product lines, new software stacks, and experience with user domains outside conventional high performance computing. The diversity of applications suggests we'll see more specialized vendors (or specialized groups within larger organizations) targeting specific Edge applications. In some cases, generic HPC platforms will evolve into HPC appliances; we're starting to see some of that today in online gaming and electronic trading.
The richer ecosystem encompassed by Edge applications might end up transforming HPC at least as much as the shift from supercomputing to commercial HPC did over the previous decade. If it all comes to pass, we'll barely recognize the market in a few years.
Posted by Michael Feldman - May 14, 2008 @ 9:00 PM, Pacific Daylight Time
Michael Feldman is the editor of HPCwire.