In the upcoming October digital issue of Advanced Trading, we explore how the buy-side trading community is responding to the threat that rogue algorithms pose to their business in the aftermath of Knight Capital Group's costly error. One head trader at a firm with more than $200 billion under management told Advanced Trading that the buy side generally puts a lot of blind faith in the tools provided by their brokers, almost to the point of complacency.
But Knight Capital's nearly fatal fall has proven to be a wake-up call. Looking ahead, traders and industry experts say the buy side will need to take a closer look at the protections their brokers have in place in the event that something goes awry. Analysts say more firms are scrutinizing the technologies their brokers use and how their orders are being placed.
In researching this story, Advanced Trading spoke with Jason Scharfman, an expert in hedge fund operational due diligence and managing partner of Corgentum Consulting. As part of the broader story that will run next month, Scharfman breaks down why rogue algos will never fully go away, what the buy side must do to protect itself, and why self-policing from the industry, not additional regulation, is what's needed here.
Advanced Trading: How much of a problem have rogue algorithms become, and what are the chances of a repeat of the Knight Capital scenario if there isn't an industry-wide fine-tuning of best practices around algo trading?
Jason Scharfman, Corgentum Consulting: You have to consider the magnitude of the problem. There are algorithms that crash all the time, or models that break or don't perform the way they're supposed to. In different markets you'll see the suspicion of mini-crashes in certain names, or certain types of positions. But the way these algorithms work, it's unclear whether it's a model trading or a human taking an active position in something.
Because you don't know that in some cases. You'll say, 'well, it looks really suspicious – it's high frequency. It's probably a model doing something.' And then you have to ask, is this actually supposed to be happening, or is this an error in the model? I think it's very likely something like this could happen again. Now whether it can happen at this scale is a different story.
The SEC is working on a limit up/limit down rule that's going to take effect later this year. It's almost like a circuit breaker. That kind of measure will help control it. But I don't think you can ever eliminate it.
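The limit up/limit down mechanism Scharfman describes can be sketched in a few lines. Note this is an illustrative simplification, not the SEC's actual plan: the real rule sets price bands per security tier and reference-price level, and triggers a trading pause rather than a simple rejection. The 5% band below is a placeholder.

```python
# Illustrative sketch of a limit up/limit down ("circuit breaker") check.
# The 5% band is a placeholder; the SEC's actual plan defines bands by
# security tier and price level, and pauses trading rather than rejecting.

def luld_bands(reference_price: float, band_pct: float = 0.05) -> tuple[float, float]:
    """Return (lower, upper) price bands around a recent reference price."""
    return (reference_price * (1 - band_pct),
            reference_price * (1 + band_pct))

def within_bands(price: float, reference_price: float, band_pct: float = 0.05) -> bool:
    """True if a trade price falls inside the allowed bands."""
    lower, upper = luld_bands(reference_price, band_pct)
    return lower <= price <= upper
```

A runaway algorithm hammering prices outside these bands would be halted at the exchange level, which is why such controls limit, but cannot eliminate, the damage a rogue model can do.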
Advanced Trading: Are more regulations the answer to the problem of algorithms spinning out of control and dislocating the market? Or do market participants need to self-police this?
Scharfman: I think it's both. We'll do work with investors who are looking at hedge funds who utilize these strategies. It's really where you have to say, okay, the regulators are going to attempt to prevent catastrophe situations, but the responsibility really is on investors - the industry - to police this type of thing. Because the regulators don't have the resources or the transparency to catch it before it gets out of control. They show up after it's too late in most cases. The responsibility really is with the investment community.
Advanced Trading: Is the buy side now being more proactive about trying to develop customized algorithms and not just take what's off the shelf as is?
Scharfman: It's a combination. They're always going to start with some off-the-shelf or prebuilt base or core. But I think there's definitely more and more customization being put into that so it complies with, or fits better with, their existing systems, and also to try to generate additional returns based on whatever they're seeing.
Which is interesting when you think about it, because even though algo trading is computer-based and a human is not executing the trade, a human still writes the code. That's where you get into things like establishing procedures for how these things will trade.
But I think that there should be a focus on – and this is an area that's really ignored in terms of best practice – if there's a new piece of research, or a new piece of code or formula that's going into the algorithm, how is that tested? How does that get rolled out into the algorithm? Are they running it in any sort of simulated environment, or at a lower-risk dollar level, before it's rolled out at full scale?
Because somebody could think they have a great idea and formula, put it in the market and cause big problems.
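One way to implement the staged rollout Scharfman is describing is to cap the notional size a new strategy can trade until it has passed simulated or low-dollar testing. This is a hypothetical sketch of that gating idea, not any firm's actual risk system; the class and field names are invented for illustration.

```python
# Hypothetical sketch of a staged-rollout gate: new algo code trades
# under a reduced per-order notional cap until it is promoted after
# passing simulated / low-risk-dollar testing.

from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    quantity: int
    price: float

class StagedRollout:
    """Caps per-order notional for strategies still in their trial phase."""

    def __init__(self, trial_notional_cap: float):
        self.cap = trial_notional_cap
        self.promoted = False

    def allow(self, order: Order) -> bool:
        """Permit the order only if promoted, or under the trial cap."""
        if self.promoted:
            return True
        return order.quantity * order.price <= self.cap

    def promote(self) -> None:
        """Lift the cap once the strategy has cleared simulated testing."""
        self.promoted = True
```

The design choice here is that the gate fails closed: until someone explicitly promotes the strategy, a bad formula can only do capped damage, which is exactly the protection Scharfman says is often ignored.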
As the Senior Editor of Advanced Trading, Justin Grant plays a key role in steering the magazine's coverage of the latest issues affecting the buy-side trading community. Since joining Advanced Trading in 2010, Grant's news analysis has touched on everything from the latest ...