May 13, 2013
Update: D-Wave system coming to Google and NASA.
Supercomputing veteran Bo Ewald has been neck-deep in bleeding-edge system development since his twelve-year stint at Cray Research, which began in the mid-1980s and was followed by tenures at large organizations like SGI and at startups including Scale Eight Corporation and Linux Networx.
As we reported earlier this month, Ewald is stepping into yet another new role, this time at the helm of the first quantum computing company, D-Wave Systems. During our recent conversation, Ewald confirmed his belief that quantum computers will be at the heart of a new wave of computing—at least for a certain set of specific optimization, machine learning and pattern recognition problems.
“This is the early days, almost like when the first Cray-1 or Thinking Machines systems came out,” Ewald reminisced. The same skepticism, the same questions of scientific and business practicality, and the same promise exist today, he argues.
D-Wave's technology has been 14 years in development, and the company has finally arrived at its commercialization moment, which it will pitch from a new office in Palo Alto. With a recognizable name like Ewald front and center, it's clear the company sees opportunities beyond its one public customer, Lockheed Martin. Ewald said he researched the company heavily to validate its commercial viability, and he will lead D-Wave's charge into defense and intelligence, research, and other potential markets. The catch, of course, is that organizations need a spare $10 million or more, plus the right physics and math pros, to tap into the programmatic possibilities.
Like the historical systems mentioned above, the company's flagship system, the D-Wave One, was greeted with equal parts intense skepticism and excitement. With some highly publicized demos and a customer deployment under its belt, D-Wave thinks it can find a solid market for its 128-qubit processor-based technology, which comes wrapped in its own cryogenically cooled, heavily shielded enclosure.
The company will face a lengthy battle against the perception that these quantum computers are fringe or merely experimental. However, some researchers, including Dr. Catherine McGeoch, Beitzel Professor in the Computer Science department at Amherst College, are validating its performance claims: in her benchmarks, the quantum system vastly outpaced conventional computing for a particular range of applications. Meanwhile, the work at the USC-Lockheed Martin Center for Quantum Computing continues to lend serious credibility to, again, a certain class of optimization problems.
As with all early-stage innovations in computer science, there is a major programming and software ecosystem gap. Ewald says this is really no different from what happened with GPUs: those accelerators posed the same code and partitioning problems before the software tooling arrived in spades, and he argues the same story will play out here. At this point, mathematicians and physicists can construct their problems numerically using the handful of tools at their disposal and then map them onto the quantum machine.
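For a sense of what mapping a problem onto the machine means in practice: D-Wave's processors are built to minimize objectives in Ising or QUBO (quadratic unconstrained binary optimization) form, so the mapping step amounts to encoding a problem as the coefficients of such an objective. Below is a minimal Python sketch with made-up coefficients, purely for illustration; a classical brute-force loop stands in for the physical search the annealer performs.

```python
import itertools

# Toy QUBO objective: E(x) = sum over (i, j) of Q[(i, j)] * x_i * x_j,
# where each x_i is 0 or 1. Diagonal entries act as linear terms, since
# x_i * x_i == x_i for binary variables. These coefficients are invented
# solely for illustration, not drawn from any real D-Wave workload.
Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): 2.0,   # linear terms
    (0, 1): 2.0, (0, 2): -1.5, (1, 2): 0.5,    # pairwise couplings
}

def energy(x, Q):
    """Evaluate the QUBO objective for a binary assignment x."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Brute force over all 2^n assignments: a classical stand-in for the
# low-energy state the quantum annealer settles into. This is feasible
# only for tiny n, which is exactly why faster search is interesting.
n = 3
best = min(itertools.product((0, 1), repeat=n), key=lambda x: energy(x, Q))
print("best assignment:", best, "energy:", energy(best, Q))
```

The hard part of real-world use is the encoding itself: a problem's constraints have to be folded into the objective as penalty terms, which is the kind of work Ewald says currently falls to mathematicians and physicists.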
“We’re on the edge of something revolutionary,” he explained. “This is far different than traditional scientific computing and high performance computing, which is numerically intensive—it’s about crunching a lot of numbers.”
For a range of optimization problems, however, where calculating with the standard set of ones and zeros results in incredibly slow and complex computations, quantum computing relies on “mapping an optimization problem onto the quantum computer so it can instantaneously, once it reaches the quantum state, give you a better solution than the one you started with. With multiple iterations, it will arrive at the best possible answer.”
To put optimization problems into a “normal” context, imagine the following, very common scenario. A massive snowstorm in Chicago has grounded an unprecedented number of flights. Airlines need to quickly figure out the best possible way to move planes and crews around to adapt. A few iterations on the D-Wave One, says Ewald, and there it is.
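Ewald's iterate-and-improve description has a familiar classical analogue in simulated annealing, which also walks toward lower-energy (better) solutions over many iterations. The sketch below, reusing the hypothetical energy function and Q from the earlier snippet, shows only that classical analogue, not the quantum process; a real airline problem would first encode gates, crews and scheduling rules as penalty terms in the objective.

```python
import math
import random

def anneal(energy_fn, n, steps=5000, t0=2.0):
    """Classical simulated annealing over n binary variables: a software
    stand-in for the multiple-iteration refinement Ewald describes. The
    quantum hardware relaxes toward low energy physically instead."""
    x = [random.randint(0, 1) for _ in range(n)]
    cur = energy_fn(x)
    best, best_e = list(x), cur
    for step in range(steps):
        t = max(t0 * (1 - step / steps), 1e-3)  # simple linear cooling
        i = random.randrange(n)
        x[i] ^= 1                               # propose flipping one bit
        new = energy_fn(x)
        # Always accept improvements; accept uphill moves with probability
        # exp(-delta / t) so the search can escape local minima early on.
        if new <= cur or random.random() < math.exp((cur - new) / t):
            cur = new
            if cur < best_e:
                best, best_e = list(x), cur
        else:
            x[i] ^= 1                           # reject: undo the flip
    return best, best_e

# Reusing the toy QUBO from the earlier sketch (invented coefficients):
print(anneal(lambda x: energy(x, Q), n=3))
```

Whether the quantum version beats such classical heuristics, and by how much, is precisely the question McGeoch's benchmarking set out to answer.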
Sounds almost too good to be true. Well, there are some catches. The most obvious is the sheer complexity of the quantum process itself; beyond that, there is the programming effort required for these select optimization, machine learning, and pattern recognition problems.
Beyond using qubits rather than classical bits to solve some of the most perplexing problems in computer science, there are other elements that make the inner workings of D-Wave's deep-freeze boxes noteworthy. While Ewald couldn't discuss details, he said the real challenge that all the years of R&D have been tackling lies in getting the qubits, the quantum bits, to interact in a way that entangles them. Once they do, the system can settle into a lower energy state, but creating those conditions presents tough hurdles.
The qubits must be held near absolute zero, vibration and stray magnetic fields must be eliminated, and the processor must operate in a near-perfect vacuum. That's a tall order, but Ewald said the science is there and the applications are real. D-Wave has managed to create this environment to the point where it can get up to 500 qubits into a quantum state.
But theory aside, who will be installing a multi-million-dollar ($10 million and up) D-Wave One in the next few years, especially at a time of crunched budgets? Perhaps the best advertising mechanism the company has lies in its work with Lockheed Martin. While Lockheed hasn't been overt about which problems it runs on its D-Wave setup, the USC-Lockheed Martin Center for Quantum Computing has been very vocal about its belief in the future of quantum computing.
Lockheed took care to stress the importance of optimization problem solving (finding the best possible answer in a sea of possible answers), which suggests that is where its interests lie. Specific government, intelligence and industrial uses remain unclear, but Ewald says that new uses and use cases for these systems will emerge in all the areas typically reserved for HPC, including financial services, oil and gas, and life sciences: the usual suspects.
“This type of computer is not intended for surfing the internet, but it does solve this narrow but important type of problem really, really fast,” said Dr. Catherine McGeoch. “There are degrees of what it can do. If you want it to solve the exact problem it’s built to solve, at the problem sizes I tested, it’s thousands of times faster than anything I’m aware of. If you want it to solve more general problems of that size, I would say it competes – it does as well as some of the best things I’ve looked at. At this point it’s merely above average but shows a promising scaling trajectory.”
For now, D-Wave stands alone in an emerging market, in much the same way Cray was the monolith at the beginning of the era it kicked off. Ewald is in the unique position of having been at the forefront of one disruptive event in technology, while rounding out his long career leading another such transition.