May 18, 2009
Welcome to the era of the computational Web. Wolfram Alpha, Stephen Wolfram's online "computational knowledge engine," debuted on Friday evening and was officially launched on its Web site on Monday. Wolfram is calling it the first killer app for the universal computation paradigm he developed seven years ago.
Unlike Google or other Internet search engines, Wolfram Alpha focuses on what comes most naturally to computers: crunching numbers. Using Mathematica as the software foundation, the engine applies over 50,000 types of algorithms across more than 1,000 knowledge domains. The query syntax is fairly straightforward and forgiving, and suggestions are provided if you manage to stump the input parser. The results are displayed in a number of useful ways -- graphically whenever possible.
Users can ask for operations as diverse as comparing two publicly traded companies, figuring out the mileage between two cities, blending colors, finding interesting facts about a birthday, or playing a major scale. Not surprisingly, it can also handle straight math problems like multiplying matrices, computing a derivative, or factoring a polynomial expression.
Unlike vanilla database Web applications, Wolfram Alpha employs HPC clusters for its computational hardware. At launch time, the application had access to about 10,000 x86 CPUs spread across five datacenters. The largest cluster is R Smarr, a 40 teraflop (Linpack) Dell machine owned and operated by R Systems Inc. The system consists of 576 dual-socket quad-core Harpertown servers, hooked together by DDR InfiniBand. R Smarr currently sits at number 66 on the TOP500 list.
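As a back-of-the-envelope check, those cluster numbers hang together. A minimal sketch, assuming Harpertown parts at roughly 2.8 GHz with 4 double-precision flops per core per cycle (the clock speed is an assumption; it is not stated in the article):

```python
# Hypothetical peak-performance estimate for the R Smarr cluster.
# Assumed: 2.8 GHz clock, 4 DP flops/core/cycle -- typical for
# quad-core Harpertown (Xeon 54xx) parts, but not confirmed here.
servers = 576
cores = servers * 2 * 4            # dual-socket, quad-core
peak_tflops = cores * 2.8e9 * 4 / 1e12
linpack_tflops = 40.0              # figure quoted in the article
efficiency = linpack_tflops / peak_tflops

print(f"{cores} cores, ~{peak_tflops:.1f} Tflops peak, "
      f"~{efficiency:.0%} Linpack efficiency")
```

At that assumed clock, the quoted 40 Linpack teraflops implies a Linpack efficiency in the high 70s -- a plausible figure for a DDR InfiniBand cluster of that era.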
During its weekend debut, Wolfram Alpha was processing queries at rates of 80 to 120 per second, although not flawlessly. According to Wolfram Research co-founder Theodore Gray, about 70 percent of the queries were successful. The rest prompted the user to frame the question somewhat differently, usually with some suggestions. "I think that that's a pretty good ratio considering that these are just people from the wild coming in without coaching on what ought to work," said Gray.
A smaller number of queries delivered incomplete results or a "computation timed out" message. Gray claimed that this was due to some faulty servers, rather than a load issue. Supposedly, the current setup is able to handle thousands of queries per second.
On the other hand, not all queries are created equal. For example, computing a Haferman fractal is no problem at 5 iterations (the default), but if you specify the same fractal at 7 iterations or greater, the engine just chokes.
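The blow-up is easy to see: the Haferman carpet is built by a 3x3 substitution, so each iteration triples the grid's side length, and the cell count grows ninefold per step. A minimal sketch in Python (the substitution rules below are one common formulation, not necessarily the exact one Wolfram Alpha uses):

```python
def haferman(n):
    # One common rule set: a white cell (0) expands to a solid black
    # 3x3 block, a black cell (1) to a checkerboard pattern. These
    # rules are an assumption for illustration.
    rules = {
        0: [[1, 1, 1], [1, 1, 1], [1, 1, 1]],
        1: [[1, 0, 1], [0, 1, 0], [1, 0, 1]],
    }
    grid = [[1]]
    for _ in range(n):
        # Replace every cell with its 3x3 expansion, tripling both
        # dimensions of the grid on each iteration.
        grid = [
            [rules[cell][r][c] for cell in row for c in range(3)]
            for row in grid
            for r in range(3)
        ]
    return grid
```

Five iterations give a 243x243 grid (about 59,000 cells); seven give 2187x2187 (about 4.8 million) -- an 81-fold jump, which is presumably where the timeout kicks in.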
The initial Wolfram Alpha database is said to contain over 10 trillion items of curated data, with live feeds being used to provide continuous updates. Despite that effort, it wasn't too difficult to find some holes. For example, while Wolfram Alpha correctly computes that red + yellow produces orange, it is also under the impression that blue + yellow produces gray. Finally, some data is just missing, e.g., it can answer why the sky is blue, but not why grass is green.
Obviously, one of the biggest challenges for Wolfram Alpha will be to fill in missing data, while keeping it coherent and accurate -- no small task (just ask Wikipedia). Given the exponential growth rates of information, manual curation can only do so much, so presumably they will have to find ways to automate the process or rely on third parties to deliver domain-specific data.
Despite these limitations, the computational model is quite powerful. A lot of the functionality in Wolfram Alpha previously existed in other online computational applications (currency converters, calculators, geomapping services, gene mapping, etc.), but until now there was no common framework that brought them together. If successful, an online tool such as this for general-purpose computation could change the Web landscape.
The business model remains a question. The Web site is free to the public, and there are currently no advertisers on the site. According to the Wolfram Alpha FAQ, they initially intend to go after corporate sponsorships and customized business deployments. In the latter case, they're looking for companies that can apply the technology to internal databases via a Wolfram Alpha API. Longer term, they may look for targeted advertising. If Wolfram's dream of universal computation comes true, he'll have more business than he knows what to do with.
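To make the API idea concrete, here is a sketch of what a REST-style query might look like. The endpoint path and the "appid"/"input" parameter names follow the v2 query interface Wolfram Research later published; at launch time the API was not yet documented, so treat every detail below as an assumption:

```python
from urllib.parse import urlencode

# Hypothetical request builder for a Wolfram Alpha REST API.
# Endpoint and parameter names are assumptions, not details
# confirmed anywhere in this article.
API_BASE = "http://api.wolframalpha.com/v2/query"

def build_query_url(appid: str, query: str) -> str:
    """Return the URL for a single computational query."""
    return API_BASE + "?" + urlencode({"appid": appid, "input": query})

# Example: ask the engine to factor a polynomial.
url = build_query_url("DEMO-APPID", "factor x^2 - 5x + 6")
```

A corporate deployment along these lines would simply point the same kind of request at an internal database instead of the public engine.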