The debate about U.S. equity market structure has intensified in recent months, with high-frequency trading (HFT) under the spotlight. This high-speed form of trading has polarized opinion, not only within the financial industry but also in regulatory and political circles.
Proponents of HFT point to narrower spreads, lower execution costs and increased liquidity as benefits. But critics, including Congressman Ed Markey, who in a letter to the SEC described HFT as a "clear and present danger to the stability and safety of our markets," say that the liquidity provided by high-frequency trading firms is fleeting and of little value to long-term investors.
It's difficult to see how these two sides will ever see eye to eye. But one thing they might, just might, agree on is that technology has changed the mechanics of our business forever.
The past two decades have seen a complete transformation in the structure of capital markets. Once the sole remit of open outcry floors, price formation now takes place predominantly on electronic order books. And electronic networks now meld all layers of order flow (passive/active, institutional/retail, proprietary/agency) across a growing number of execution venues (lit, dark, order- and quote-driven). This complexity can be frightening, and may partly explain the recent attacks by some industry participants and observers on high-frequency trading practices.
But before we start attacking HFT, it's worth noting the similarities that exist between modern trading methods and historical ones. In fact, I would argue that the fundamental processes that guide individuals (and HFT algorithms) are the same now as they were hundreds of years ago, irrespective of the technology we use.
These processes can be broken down into three key stages:
1) Data capture: the ability to aggregate all sources of information relevant to your investment/trading decision
2) Data processing: the ability to process real-time inputs along with other relevant information (historical patterns, fundamental data, opinion, analytics, etc.) to determine how best to respond to changing circumstances
3) Data contribution: the ability to articulate your response to the market (e.g., revising your quote) based on conclusions gleaned from data processing
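The three stages above can be sketched as a toy loop. Everything here is illustrative (the function names, the synthetic tick feed, and the naive moving-average quoting logic are assumptions for the sketch, not any real system's API):

```python
from collections import deque
from statistics import mean

def capture(feed):
    """Stage 1 - data capture: pull the next tick from an aggregated feed."""
    return next(feed, None)

def process(tick, history):
    """Stage 2 - data processing: combine the live tick with recent history
    to decide a response (here, a naive mid-price vs. moving-average check)."""
    history.append(tick["mid"])
    return "tighten" if tick["mid"] > mean(history) else "widen"

def contribute(decision, tick, edge=0.01):
    """Stage 3 - data contribution: articulate the response as a revised quote."""
    spread = edge if decision == "tighten" else 2 * edge
    return (tick["mid"] - spread / 2, tick["mid"] + spread / 2)

# Toy run over a handful of synthetic ticks.
feed = iter([{"mid": 100.00}, {"mid": 100.02}, {"mid": 99.98}])
history = deque(maxlen=5)
while (tick := capture(feed)) is not None:
    bid, ask = contribute(process(tick, history), tick)
    print(f"quote {bid:.3f} / {ask:.3f}")
```

The point of the sketch is structural: whether the components are eyes, brain and voice or feed handlers, strategy logic and order gateways, the same three stages run in the same order.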
Natural Born Systems
Back in the good old days of open outcry, the systems that performed these processes were based on some pretty elemental, yet wonderfully complex biological technology.
Data capture was performed by your eyes and ears; data processing by your brain (complete with historical data store, analytics and a complex event processing engine); data contribution was via your vocal cords and mouth, and through body language; while your central nervous system provided the underlying message bus that ensured "seamless integration" of all these components.
These systems were not infallible. They were susceptible to viruses (colds and flu could dull response times) and required proper maintenance (hangovers could corrupt memory). But they served us well through the ages, thanks above all to the complexity and creativity of the human brain, something computers still can't rival.
Rise of the Machines
However, the rise of computerized trading did highlight two limitations in our ability, as humans, to process and respond to information: speed and scale.
Studies of mental chronometry show that the average person takes about 160 milliseconds to respond to an audio stimulus (e.g., pushing a button on hearing a sound) and about 190 milliseconds to respond to a visual one.
By comparison, HFT systems can capture, process and respond to inbound data feeds in less than 100 microseconds (roughly 1,600 to 1,900 times faster than even our quickest reactions). It's this super-human ability that appears monstrous to HFT's critics.
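The gap is easy to quantify from the figures above. The 100-microsecond machine latency is taken as a representative value at the top of the "under 100 microseconds" range cited in the text:

```python
# Human reaction times from mental-chronometry studies (in seconds).
audio_reaction = 160e-3   # ~160 ms to an audio stimulus
visual_reaction = 190e-3  # ~190 ms to a visual stimulus

# Representative HFT tick-to-trade latency (in seconds); an assumed
# round figure at the top of the cited "under 100 microseconds" range.
hft_latency = 100e-6

print(f"audio stimulus:  {audio_reaction / hft_latency:,.0f}x slower")
print(f"visual stimulus: {visual_reaction / hft_latency:,.0f}x slower")
```

Running this prints a ratio of 1,600x for audio and 1,900x for visual; systems faster than 100 microseconds widen the gap further.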
Who's Really Calling The Shots?
Even so, it's worth remembering that we are still in charge of the machines. Trading algorithms don't go out and build themselves. All of the intellectual property contained within an algorithm is contributed by individuals capable of translating their knowledge of the market into an automated trading process. Sometimes that knowledge gets lost in translation, with costly repercussions for the algo's masters.
The role of technology is pervasive in our age of information. It has completely transformed the way we consume information, not only in financial markets but in our everyday lives. While the benefits and drawbacks of HFT continue to be debated, turning back the clock on technological progress may prove impossible.
Excerpts of this piece are taken from a Thomson Reuters whitepaper titled "A Historical Constant: The Importance of Data Quality" (PDF).
About the Author: Dan Solak is the Global Head of Elektron Feeds for the Trading business at Thomson Reuters. Dan has been instrumental in driving the global rollout of Thomson Reuters' next-generation consolidated feed, Elektron Real Time, and also has responsibility for Machine Readable News and Tick History services.