With the widespread distribution of trading algorithms by the sell side and third-party vendors to buy-side and sell-side traders over the past several years, algorithmic usage has moved beyond its niche among stat arbs and other quants running high-speed and/or complex strategies and into the mainstream. Now, with the vast availability of scalable and customizable second-generation algorithms, spending time to understand the logic and psychology behind an algorithm is not just increasingly important; it is imperative to the success of any trading strategy.
Ideally, if the algorithm is built correctly and the proper one is selected, it will trade an order the way a human would, arguably with more consistency and greater efficiency. That frees the trader to focus on higher-value aspects of the job, such as deciding which algorithm is the right one and gauging the level of urgency required by the portfolio manager. For traders to become true experts in selecting algorithms and intraday trading strategies, however, they need to be able to see the algos in motion.
Lack of Transparency
Today, we can only say what a specific algorithm is supposed to do, measure its pre-trade analytics and see how the post-trade results match that expectation. By then, however, it is too late if the trader did not select the optimal algorithm for that trade. The problem is a lack of visibility and transparency into the algorithm while it is executing orders.
In a pre-algo environment, when trading was done by a human on electronic communications networks (ECNs), alternative trading systems (ATSs) and exchanges, we knew how much of our order was displayed where, how long it was there, how much was printing on the bid or on the offer, what portion of the order was crossed in various ATSs, and so on. When we employ an algorithm to trade a specific order, we lose that visibility, because the provider offers no such transparency.
An undeniable factor in real-time transaction cost analysis is a trader's "gut feel." It is the intraday trading characteristics of a stock that help a trader decide whether backing off or getting more aggressive is the right move. It is too early in the development of trading software to think that the thought process of an algorithm can mimic that of a human trader, and pre-programmed instructions will always have a difficult time competing with the human brain's ability to react to unanticipated consequences (and opportunities). The combination of the two will most likely yield the best results.
Some providers have begun to address this issue by offering instant messaging (IM) services that work alongside the algo. As it trades, you receive alerts about developments you might otherwise have missed unless you were actively monitoring the algo. A good trader working a stock manually knows when a block prints around the order, where it traded and why; likewise, if news breaks that should alter the strategy, the trader needs to know about it. These instant alerts keep you on top of your stocks while the algo works.
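The kind of alert service described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual product: the function name, field names and the 10,000-share block threshold are all assumptions.

```python
# Hypothetical sketch of an algo-side alert scan: flag block prints in
# the traded symbol and any news tagged to it, as an IM service might.

BLOCK_SIZE = 10_000  # shares; assumed threshold for a "block" print

def scan_tape(symbol, prints, news_items):
    """Return the alerts a trader would want while the algo runs."""
    alerts = []
    for p in prints:  # each print: dict with symbol, size, price, venue
        if p["symbol"] == symbol and p["size"] >= BLOCK_SIZE:
            alerts.append(
                f"BLOCK: {p['size']:,} {symbol} @ {p['price']} on {p['venue']}"
            )
    for item in news_items:  # each item: dict with symbols, headline
        if symbol in item.get("symbols", ()):
            alerts.append(f"NEWS: {item['headline']}")
    return alerts

prints = [
    {"symbol": "XYZ", "size": 25_000, "price": 41.02, "venue": "NYSE"},
    {"symbol": "XYZ", "size": 300, "price": 41.01, "venue": "ARCA"},
]
news = [{"symbols": ["XYZ"], "headline": "XYZ guides Q3 revenue lower"}]

for alert in scan_tape("XYZ", prints, news):
    print(alert)
```

In practice the prints and news items would arrive on a live feed rather than as lists, but the filtering logic is the same.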
Dynamic Transparency Is Key
To understand algorithms and get where we need to be, dynamic transparency is imperative. Traders need a real-time, dynamic display of which venue orders are being sent to or posted at, when they are canceled, and where they are filled, so that their guts can play a proper role. Was it a bid that got hit, or an offer that the algo took? Answers to questions like this let a single-stock trader evaluate whether that algo is the right one for the situation.
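As a minimal sketch, the event stream a dynamically transparent algo could expose might carry routes, cancels and fills, with each fill classified against the quote at execution time to answer the bid-hit versus offer-taken question. All field names here are assumptions for illustration.

```python
# Sketch of an execution-event feed with a bid-hit / offer-taken
# classification for each fill. Field names are assumptions.

def classify_fill(side, price, bid, offer):
    """Label a fill relative to the prevailing quote."""
    if side == "sell" and price <= bid:
        return "hit the bid"      # liquidity-taking sell
    if side == "buy" and price >= offer:
        return "took the offer"   # liquidity-taking buy
    return "passive fill"         # filled on our own posted quote

events = [
    {"type": "route",  "venue": "ARCA", "qty": 500},
    {"type": "cancel", "venue": "ARCA", "qty": 500},
    {"type": "fill",   "venue": "NYSE", "qty": 500,
     "side": "buy", "price": 41.03, "bid": 41.01, "offer": 41.03},
]

for ev in events:
    if ev["type"] == "fill":
        tag = classify_fill(ev["side"], ev["price"], ev["bid"], ev["offer"])
        print(f"fill {ev['qty']} on {ev['venue']}: {tag}")
    else:
        print(f"{ev['type']} {ev['qty']} on {ev['venue']}")
```

Even this crude per-event log answers the questions in the text: where the order was posted, when it was canceled, and whether the fill was passive or liquidity-taking.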
To take it one step further, program traders would need this data aggregated in order to switch strategies. There is much talk about real-time analytics. Will it be necessary? How will it work with both pre- and post-trade transaction cost analysis (TCA)?
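The aggregation a program trader would need can be sketched as a rollup of per-order fills into a per-symbol real-time TCA view, here measured as slippage versus an arrival price. This is an assumed, simplified data shape, not any vendor's actual analytics.

```python
# Sketch: aggregate basket fills into a per-symbol real-time TCA view
# (filled quantity, average price, slippage vs arrival in basis points).
from collections import defaultdict

def aggregate(fills, arrival):
    """Return {symbol: (filled_qty, avg_price, slippage_bps)}."""
    qty = defaultdict(int)
    notional = defaultdict(float)
    for f in fills:
        qty[f["symbol"]] += f["qty"]
        notional[f["symbol"]] += f["qty"] * f["price"]
    out = {}
    for sym in qty:
        avg = notional[sym] / qty[sym]
        # For a buy, positive bps means paying up versus arrival.
        bps = (avg - arrival[sym]) / arrival[sym] * 10_000
        out[sym] = (qty[sym], round(avg, 4), round(bps, 1))
    return out

fills = [
    {"symbol": "ABC", "qty": 1_000, "price": 20.05},
    {"symbol": "ABC", "qty": 500, "price": 20.10},
    {"symbol": "XYZ", "qty": 2_000, "price": 41.00},
]
arrival = {"ABC": 20.00, "XYZ": 41.05}
print(aggregate(fills, arrival))
```

A rollup like this, refreshed as fills arrive, is the raw material for the strategy-switching decision the text describes: a symbol drifting too far from arrival is a candidate for a more (or less) aggressive algorithm.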