June 22, 2007
... or is it the other way around?
If you want to know where high performance computing is headed, just follow the money. In particular, look at how aggressively Wall Street is applying advanced computing infrastructure in its quest to expand profits. A Microsoft-sponsored capital markets survey released this week showed that nearly 80 percent of the industry respondents expect to expand their HPC capacity in the next 12 to 18 months. Microsoft cites the demand for faster financial data analysis as a driver, but the rapid increase in the volume of market data is also fueling Wall Street's interest in high performance infrastructure.
The relentless downward spiral in the cost of computing hardware is helping to make this build-out possible. But almost half of the survey respondents reported that performance was more important to them than price. This explains why the financial industry always seems to be the first one in line when products based on newer technologies like the Cell processor, GPUs, or stream computing are announced. That said, most Wall Street firms are still using vanilla clusters to drive their financial analytics and modeling codes. And 24 percent of the respondents said they plan to increase their HPC capacity by 1,000 nodes or more by the end of 2009.
"This research confirms what we've been witnessing in the market -- that capital markets firms remain on the cutting edge, and that the dot-com bust of the early 2000s has now turned into a period of reinvestment for firms seeking technologies to help them grow," said Craig Saint-Amour, director of capital markets solutions in the U.S. Financial Services Group at Microsoft.
But it's not just hardware. Software is at the center of the revolution going on in the financial services industry. Algorithmic trading is all the rage on Wall Street. Fed by low-latency exchange data feeds that can deliver up to a million messages per second, algorithmic trading platforms are proliferating. As a result, the number of human traders is plummeting even as the number of actual trades is skyrocketing. According to Aite Group LLC, a consulting group for the financial services industry, algorithmic trading accounted for about one third of total equities trading volume at the end of 2006. By the end of 2010, they estimate that approximately half of all equities trading will be done algorithmically.
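To make the idea concrete, here is a minimal sketch of the kind of decision logic an automated trading platform runs against a stream of price ticks. This is a toy moving-average crossover strategy for illustration only; it is not drawn from any of the firms or products mentioned in the article, and real systems are vastly more sophisticated.

```python
from collections import deque

class CrossoverStrategy:
    """Toy example: emit 'BUY' when the short-window average price
    crosses above the long-window average, 'SELL' on the reverse cross."""

    def __init__(self, short=5, long=20):
        self.short_w = deque(maxlen=short)   # recent prices, short horizon
        self.long_w = deque(maxlen=long)     # recent prices, long horizon
        self.prev_diff = None                # previous short-minus-long gap

    def on_tick(self, price):
        """Process one price message; return 'BUY', 'SELL', or None."""
        self.short_w.append(price)
        self.long_w.append(price)
        if len(self.long_w) < self.long_w.maxlen:
            return None  # not enough history yet
        diff = (sum(self.short_w) / len(self.short_w)
                - sum(self.long_w) / len(self.long_w))
        signal = None
        if self.prev_diff is not None:
            if self.prev_diff <= 0 < diff:
                signal = "BUY"    # short average just crossed above long
            elif self.prev_diff >= 0 > diff:
                signal = "SELL"   # short average just crossed below long
        self.prev_diff = diff
        return signal
```

The point of the sketch is the shape of the problem: a per-message handler that must run in microseconds, since at a million messages per second each tick leaves almost no time budget for the decision.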
And as more firms automate their trade processes, the competition to deploy the smartest, fastest trading software is escalating. Financial firms are scrambling to hire quantitative analysts, or quants, the computer science geeks who devise killer algorithms for financial analysis applications. The idea is not just to outperform traders, but to outperform the competition's software as well. For some classes of transactions, even a millisecond can mean the difference between profit and loss. In this type of environment, mere mortals don't have a chance.
"I don't think traders are going to disappear, but if you look at salaries on Wall Street as one indicator of the future, there are quants there now that are being paid multi-million dollar bonuses," observes Kevin Pleiter, Director, Global Financial Services Sector at IBM. "These are the ones generating the millions and millions of dollars of profit for these firms."
As trading algorithms evolve, higher levels of intelligence are being built into them. In a recent article at Bloomberg.com, Jason Kelly writes about how the next generation of quants are working toward incorporating artificial intelligence into their codes. For example, the quants are looking into using natural language processing to extract information from news reports and correlate that information with its effect on financial markets. The idea is to mimic human-like intuition, but do so at the speed of the microprocessor. Some of the challenges are immense, but the motivation to take the human out of the loop is just as large.
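To illustrate the basic concept, here is a deliberately crude sentiment scorer for news headlines. The keyword lists are invented for this example, and real natural language processing of news is far richer than keyword matching; this sketch only shows the general mapping from text to a machine-readable trading signal.

```python
# Hypothetical keyword lists -- a stand-in for real NLP models.
POSITIVE = {"beat", "surge", "record", "upgrade", "growth"}
NEGATIVE = {"miss", "plunge", "lawsuit", "downgrade", "recall"}

def headline_sentiment(headline: str) -> int:
    """Return +1 (bullish), -1 (bearish), or 0 (neutral) for a headline
    by counting positive versus negative keywords."""
    words = {w.strip(".,!?").lower() for w in headline.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return (score > 0) - (score < 0)  # sign of the score
```

Even a toy like this hints at the appeal: a machine can score thousands of headlines per second and feed the result straight into a trading model, which is exactly the kind of human-out-of-the-loop pipeline the quants are pursuing.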
If you think Kelly's article describes a world of the distant future, you may be surprised to learn that commercial solutions are already emerging. This week at the SIFMA Conference, IBM previewed a software framework that could enable these next generation trading applications. The framework, called System S, provides an enterprise stream processing environment that encapsulates and manages real-time analytics applications. Although IBM steers clear of the AI nomenclature, System S is clearly meant to appeal to those looking for more human-like analysis in their software. Not surprisingly, IBM's first target for this technology is Wall Street. (Our feature article in this week's issue takes a look at how System S works.)
While these super-intelligent financial analytics codes may be fighting it out among themselves in the not-too-distant future, apparently you don't need software of that caliber to beat the traders. Even using today's technology, the quants' algorithms have done a decent job of humbling the humans. As noted by Kelly in the Bloomberg article:
"The computers have done well. A November 2005 study by Darien, Connecticut–based Casey, Quirk & Associates, an investment management consulting firm, says that from 2001 to '05, big-cap U.S. stock funds run by quants beat those run by nonquants. The quants posted a median annualized return of 5.6 percent, while nonquants returned an annualized 4.5 percent. Both groups beat the Standard & Poor's 500 Index, which returned an annualized negative 0.5 percent during that period."
The shape of things to come?
As always, comments about HPCwire are welcomed and encouraged. Write to me, Michael Feldman, at email@example.com.
Posted by Michael Feldman - June 21, 2007 @ 9:00 PM, Pacific Daylight Time
Michael Feldman is the editor of HPCwire.