Without the appropriate protocols and processing infrastructure in place, algorithmic trading can be challenging. Banc of America Securities' managing director, Rob Flatley, discusses the back end of algorithmic trading with InformationWeek's Steven Marlin.
Q: Has the FIX protocol fulfilled its promise of reducing manual steps in trade execution?
FLATLEY: Not entirely. The industry group that maintains that common language, FIX, hasn't devised standards for algorithms. The major sell-side firms each employ their own specification for VWAP (volume-weighted average price), so a lot of translation has to take place because everyone has developed a specification for a single order type, and it's a mess. It causes a lot of delays in the ability to execute trades.
Another issue is a lack of interoperability among order management systems, routing networks and FIX engine providers. So the buy side ends up having to manage a three-pronged relationship with all sorts of interdependencies; it's left with disjointed layers, and that's created a lack of usability for what should be a straightforward protocol.
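The fragmentation Flatley describes can be sketched in a few lines. In this hypothetical example (the broker names, custom tag numbers and parameter set are illustrative, not any vendor's real specification), two brokers accept the same VWAP order but carry its parameters in different custom FIX tags, so the buy side must maintain a translation layer per broker:

```python
# Hypothetical per-broker tag maps for the same three VWAP parameters.
# Real pre-FIX-5.0 algo specs varied per firm in exactly this way.
BROKER_TAG_MAPS = {
    "BrokerA": {"strategy": 9001, "start_time": 9002, "end_time": 9003},
    "BrokerB": {"strategy": 7701, "start_time": 7702, "end_time": 7703},
}

SOH = "|"  # printable stand-in for FIX's \x01 field delimiter


def vwap_order(broker, symbol, qty, start, end):
    """Build a simplified FIX NewOrderSingle (35=D) body for one broker."""
    tags = BROKER_TAG_MAPS[broker]
    fields = [
        (35, "D"),       # MsgType = NewOrderSingle
        (55, symbol),    # Symbol
        (54, "1"),       # Side = Buy
        (38, qty),       # OrderQty
        (40, "1"),       # OrdType = Market
        (tags["strategy"], "VWAP"),
        (tags["start_time"], start),
        (tags["end_time"], end),
    ]
    return SOH.join(f"{tag}={value}" for tag, value in fields)


msg_a = vwap_order("BrokerA", "IBM", 100_000, "09:30", "16:00")
msg_b = vwap_order("BrokerB", "IBM", 100_000, "09:30", "16:00")
# The same economic order produces two incompatible wire formats:
assert msg_a != msg_b
```

The standard tags (35, 38, 40, 54, 55) are genuine FIX fields; everything in the custom 7000/9000 ranges is invented here purely to show why cross-broker translation was needed.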
Q: What kinds of computational resources are needed to perform algorithmic trades?
FLATLEY: As a broker-dealer, you need to be efficient front to back for the entire securities processing cycle. Suppose you receive an algorithmic order to trade 100,000 shares - the algorithm slices that order into an average order quantity of 120 shares. That means you may get 1,300 different executions, which puts a lot of stress on downstream systems. Multiply that by 10,000 other orders coming in at the same time, and you have an idea of the computational complexity.
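The arithmetic behind that example can be sketched as follows. Assumptions are labeled in the comments: a simple uniform slicer stands in for a real VWAP schedule, and the ~1.6 fills-per-child figure is a hypothetical number chosen only to show how 800-odd child orders can book on the order of 1,300 executions:

```python
import math


def slice_order(total_qty, avg_child_qty):
    """Split a parent order into near-uniform child orders.

    A real VWAP algorithm weights slices by the expected volume curve;
    this uniform version just illustrates the order-count blowup.
    """
    n_children = math.ceil(total_qty / avg_child_qty)
    base = total_qty // n_children
    # Hand out the remainder one share at a time so quantities sum exactly.
    rem = total_qty - base * n_children
    return [base + 1] * rem + [base] * (n_children - rem)


children = slice_order(100_000, 120)
assert sum(children) == 100_000
print(len(children))  # → 834 child orders from one parent order

# Hypothetical: if each child order fills in ~1.6 partial executions on
# average, the desk books roughly 1,300 executions, as Flatley notes.
executions = round(len(children) * 1.6)
print(executions)
```

Multiplying that by 10,000 concurrent parent orders gives millions of execution records flowing into downstream systems, which is the stress Flatley is pointing at.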
Not only do you need to be efficient in the front office, you need to be efficient in the back office as well. There's a whole process of affirmation, confirmation and reconciliation that occurs after a trade.
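A toy sketch of the reconciliation step Flatley mentions (the order IDs and quantities are invented; real post-trade flows run through industry affirmation utilities rather than in-process comparisons like this):

```python
from collections import Counter


def reconcile(front_office, back_office):
    """Compare per-order executed quantity as seen by the front and back office.

    Each input is a list of (order_id, quantity) execution records.
    Returns the "breaks": orders whose totals disagree between the two views.
    """
    fo = Counter()
    for order_id, qty in front_office:
        fo[order_id] += qty
    bo = Counter()
    for order_id, qty in back_office:
        bo[order_id] += qty
    return {
        oid: (fo[oid], bo[oid])
        for oid in fo.keys() | bo.keys()
        if fo[oid] != bo[oid]
    }


fo = [("ORD1", 120), ("ORD1", 80), ("ORD2", 500)]
bo = [("ORD1", 200), ("ORD2", 450)]
print(reconcile(fo, bo))  # → {'ORD2': (500, 450)}
```

With 1,300 executions per large parent order, it's this matching workload that multiplies in the back office.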
For more with Rob Flatley, visit www.banktech.com/sep05.