August 05, 2010
High frequency trading (HFT) is back in the news, and as seems to be the trend lately, it has become the recipient of yet more criticism. For those of you who invest in the stock market in the old-fashioned way (i.e., via your broker or through your retirement plan), HFT is a type of algorithmic trading that uses high-end computers, low-latency networks, and cutting-edge analytics software to execute split-second trades. Unlike long-term investing, the strategy is to hold the position for extremely short periods of time, the idea being to make micro-profits from large volumes of trades. In the US, it is estimated that 70 percent of the trade volume is executed in the HFT arena.
The recent news relates back to the 1,000-point stock market bounce -- the so-called "Flash Crash" -- that occurred on May 6. At the time, there was plenty of speculation flying around (including some by me) that HFT was involved in one way or another. But some recent detective work by market analysis firm Nanex suggests that questionable behavior by these HFT systems may have had a more direct role in the market chaos back in May.
Alexis Madrigal's article in The Atlantic gives a layman's account of what the Nanex techies uncovered with some clever algorithmic sleuthing. The Atlantic piece, which I came across by way of a related article in Ars Technica, points out that the HFT shenanigans were uncovered when Nanex computer engineer Jeffrey Donovan went beneath the covers to look at trading that day in millisecond-level timeframes, a level of granularity that never shows up on stock charts.
What Donovan found was evidence of "quote stuffing," a term that refers to the practice of sending large volumes of bids -- on the order of hundreds or thousands a second -- without the intent of executing a trade. For example, on May 6 there were hundreds of times that a single stock was getting 1,000 bids per second. Since the originating algorithm knew these were false bids that would never be filled, the implication is that it would have an edge on its clueless competition. Madrigal writes:
Donovan thinks that the odd algorithms are just a way of introducing noise into the works. Other firms have to deal with that noise, but the originating entity can easily filter it out because they know what they did. Perhaps that gives them an advantage of some milliseconds. In the highly competitive and fast HFT world, where even one's physical proximity to a stock exchange matters, market players could be looking for any advantage.
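The asymmetry Donovan describes can be sketched in a toy model: the originator tags the junk quotes it emits, so it can discard them with a cheap lookup, while everyone else has to evaluate the entire burst. All names and figures below are my own illustrative assumptions, not data from Nanex's analysis.

```python
import random

random.seed(42)

# A handful of genuine quotes mixed into a burst of 10,000 junk quotes.
# Each feed entry is (symbol, price, quote_id); genuine quotes carry no ID.
genuine = [("AAPL", 260.00 + i * 0.01) for i in range(50)]
junk_ids = set(range(10_000))            # IDs the stuffer assigned to its own quotes
burst = [("AAPL", 0.01, qid) for qid in junk_ids]

feed = [(sym, px, None) for sym, px in genuine] + burst
random.shuffle(feed)                     # every participant sees the same mixed feed

def originator_workload(feed, own_ids):
    # The stuffer recognizes its own quote IDs and skips them outright.
    return sum(1 for _, _, qid in feed if qid not in own_ids)

def competitor_workload(feed):
    # A competitor has no way to tell junk from signal up front,
    # so it must process every quote in the burst.
    return len(feed)

print(originator_workload(feed, junk_ids))  # 50: genuine quotes only
print(competitor_workload(feed))            # 10050: everything
```

In this sketch the originator examines 50 quotes while the competitor examines 10,050, which is the flavor of the millisecond-scale edge Madrigal describes, though real systems would gain it through timing rather than a literal ID lookup.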
The unfortunate side effect of quote stuffing is that it tends to destabilize the market, presumably as a result of so much "false" information being injected into the system. Worse yet, Nanex found this type of algorithmic spoofing was not just a one-time event that corresponded to the May 6 crash. Apparently, this behavior was (and is) going on systematically. They have uncovered dozens, and perhaps hundreds, of times on any given day when these unexplained quote bursts occur.
The HFT arena is certainly one area in which the technology has evolved so rapidly and with so little transparency and accountability that it seems to threaten the system it was designed to serve. In particular, legacy institutions like the SEC and the exchanges themselves seem to be having a hard time grappling with the consequences of supercomputing and low-latency data feeds. In the Ars Technica article I referred to above, author Jon Stokes notes that even the traders themselves recognize that they're no longer in control:
Informal conversations I've had with money managers and traders indicate that in the wake of the Flash Crash, even the insiders are scared of what the markets have become. Still, they have no choice but to keep trading—not only must they keep trading, but everyone is quietly stepping up their automated trading efforts to avoid getting eaten alive by their competitors' machines. It's a bit like The Sorcerer's Apprentice, except that there's no master magician who can step in and save us in the end.
Posted by Michael Feldman - August 05, 2010 @ 8:30 PM, Pacific Daylight Time
Michael Feldman is the editor of HPCwire.