Exchanges

Fred Federspiel and Alfred Berkeley
Commentary

High Frequency Trading and the Evolution of Liquidity in US Equity Markets

Recently, volume and profitability statistics have focused popular attention on the rise of high frequency trading. The Tabb Group now credits it with over 70% of all US equity trades, and estimates its profits at over $20 billion annually. Some of what has appeared in the media and in the blogosphere implies that high frequency trading is somehow wrong, or that the investor is being victimized.

How did we get to this point?

The dynamic, vibrant give and take of liquidity drives a thriving equity market in the United States. Because of developments in technology, market structure, and regulation, the fundamental nature of this give and take has changed radically over the past decade.

For hundreds of years, providers of liquidity have worked to earn the spread, buying at the bid and selling at the offer – simultaneously buying low and selling high. They have also striven to profit from changes in the price of the underlying shares – “alpha” in the language of the quant.

Together with the explosive growth in technical capabilities throughout the decade, two changes have upended the traditional liquidity-provision role played by specialists, dealers and market makers. First, upstart ECNs (and later exchanges) started paying their subscribers to provide liquidity, collecting revenue just from those who chose to demand liquidity. The so-called “maker-taker” pricing models encouraged traditional market makers to post their quotes on these ECNs, but the concept also opened a Pandora’s Box of experimentation with anonymous, automated market making models.
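
The economics of “maker-taker” pricing can be sketched in a few lines. The fee and rebate levels below are hypothetical, chosen only to make the arithmetic concrete; actual schedules vary by venue and tier:

```python
# Hypothetical maker-taker schedule: the venue pays a rebate to the side
# that posts (makes) liquidity and charges a slightly larger access fee
# to the side that takes it, keeping the difference.
MAKER_REBATE = 0.0025   # $ per share paid to the liquidity provider (assumed)
TAKER_FEE = 0.0030      # $ per share charged to the liquidity demander (assumed)

def maker_taker_pnl(shares: int) -> dict:
    """Per-trade cash flows under the assumed fee schedule."""
    return {
        "maker_rebate": shares * MAKER_REBATE,              # earned by the passive side
        "taker_fee": shares * TAKER_FEE,                    # paid by the aggressive side
        "venue_net": shares * (TAKER_FEE - MAKER_REBATE),   # venue keeps the spread in fees
    }

flows = maker_taker_pnl(10_000)
print(f"maker rebate: ${flows['maker_rebate']:.2f}")
print(f"taker fee:    ${flows['taker_fee']:.2f}")
print(f"venue net:    ${flows['venue_net']:.2f}")
```

Under these assumed rates, a strategy that reliably gets filled on passive orders earns the rebate on every share traded, which is precisely what encouraged market makers to move their quotes onto the ECNs.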

Second, decimalization decimated the opportunity to profit from simple spread trading. Rather than “maintaining an orderly market” as specialists and market makers, liquidity providers found it more profitable to trade anonymously and to offer liquidity only in the direction indicated by a short-term alpha model. Because of these developments, liquidity that had been provided to the market by regulated providers has come to be provided by unregulated ones; the specialists and market makers have been substantially replaced by electronic, anonymous liquidity providers – “high frequency traders.”
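
A back-of-the-envelope sketch shows why decimalization gutted simple spread capture: the minimum quoted spread fell from a fraction such as an eighth of a dollar to a penny. The share volume below is illustrative:

```python
# Gross revenue from earning the minimum spread (buy at the bid, sell at
# the offer) before and after decimalization. Volume is illustrative.
SHARES_PER_DAY = 1_000_000

spread_in_eighths = 1 / 8   # $0.125 minimum price increment, pre-decimalization
spread_in_pennies = 0.01    # $0.01 minimum price increment, post-2001

revenue_before = SHARES_PER_DAY * spread_in_eighths
revenue_after = SHARES_PER_DAY * spread_in_pennies

print(f"pre-decimal spread revenue:  ${revenue_before:,.0f}")
print(f"post-decimal spread revenue: ${revenue_after:,.0f}")
print(f"revenue retained: {revenue_after / revenue_before:.1%}")
```

With over 90% of the gross spread revenue gone at the same volume, earning the spread alone could no longer support the old specialist business model; a short-term alpha signal had to make up the difference.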

Where are we now?

Let’s start by following the money. High frequency trading strategies today profit from two sources: liquidity rebates paid by the markets, and smart intra-day timing of their trades.

Some high frequency strategies – the so-called rebate harvesting algorithms – do much more than simply harvest rebates. Many rely on very short-term alpha predictions to create profitable trading opportunities, and the best of them wring every timing advantage possible from the markets. When the alpha is strong enough, they will cross the spread to take advantage of short-term fluctuations. Operators of these strategies co-locate at the exchanges or ECNs in an attempt to be the first to act on any signal to enter, or pull back from, the market. The extreme measures they deploy to time their orders result in adverse selection losses for the institutional orders on the other side: the institutions trade too slowly when they should have traded quickly, and too quickly when they should have held back.
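
The passive-versus-aggressive choice described above – post for the rebate unless the short-term alpha forecast is strong enough to justify crossing the spread – can be sketched as a simple decision rule. All thresholds, fees, and rebates here are assumed values, not a real strategy’s parameters:

```python
# Toy decision rule for a strategy with a signed short-term alpha
# forecast, all quantities in dollars per share. Fee levels are assumed.
MAKER_REBATE = 0.0025
TAKER_FEE = 0.0030

def order_style(alpha: float, half_spread: float) -> str:
    """Choose passive vs aggressive given a short-term alpha forecast.

    Crossing the spread costs the half-spread plus the taker fee;
    posting earns the rebate but risks going unfilled while the
    predicted move happens anyway.
    """
    cost_to_cross = half_spread + TAKER_FEE
    if abs(alpha) > cost_to_cross:
        return "cross the spread (take liquidity)"
    return "post at the touch (make liquidity, earn rebate)"

print(order_style(alpha=0.001, half_spread=0.005))  # weak signal: stay passive
print(order_style(alpha=0.020, half_spread=0.005))  # strong signal: pay to trade now
```

The asymmetry is the point: the strategy only pays to cross when it expects the move to exceed the cost, which is exactly when the resting institutional order on the other side would rather not have traded.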

Other high frequency trading approaches – a class of stat-arb strategies sometimes called information arbitrage – look over longer timeframes in an attempt to detect asymmetries in trading interests, and then profit by trading before institutions have a chance to finish their orders. These intra-day timing tactics have been called “front running” or “penny jumping”; they directly generate market impact losses for institutions.

Interestingly enough, the evidence shows that institutional trading costs, taken in total, have remained remarkably constant during the transition to high frequency trading. The implicit components of transaction cost – adverse selection losses and direct market impact losses – have indeed been driven up, but the explicit commission component has decreased commensurately.
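
That offsetting effect can be made concrete with an implementation-shortfall-style decomposition. The figures below are hypothetical, chosen only to illustrate how implicit costs can rise while commissions fall, leaving the total roughly unchanged:

```python
# Hypothetical decomposition of institutional trading cost, in cents
# per share, before and after the shift to high frequency liquidity.
era_specialist = {"market_impact": 8.0, "adverse_selection": 4.0, "commission": 5.0}
era_hft = {"market_impact": 10.0, "adverse_selection": 6.0, "commission": 1.0}

def total_cost(components: dict) -> float:
    """Total cost is the sum of implicit and explicit components."""
    return sum(components.values())

for name, era in [("specialist era", era_specialist), ("HFT era", era_hft)]:
    print(f"{name}: {total_cost(era):.1f} cents/share")
```

In this stylized example the total is identical in both eras even though every component has shifted, which is the pattern the aggregate cost studies describe.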
