Market data message rates are increasing by an average of 120 percent a year and show no signs of abating, warned industry experts at a market data forum last week. Over the past six months, "we've seen a doubling of the message rates -- it's something we should all be concerned with," says Daniel Connell, president of Harrison, N.Y.-based market data company ComStock, which held the client forum last Thursday in New York.
In general, the increases stem from electronic options exchanges, program trading and algorithmic trading strategies.
Connell says the escalating market data volumes are having a tremendous effect across the market data chain -- from the producers to the vendors to the distributors to the end users. "We don't see any reason why this will change," he adds. "It's just a matter of how steep it will be going forward."
"Most firms have got to stay ahead of this [by upgrading their market data distribution platforms]. If they're not, they're in trouble," warns Peter Esler, former managing director, principal and global head of market data services at Bear Stearns. Esler pointed to dramatic leaps in program trading -- from 10 percent a decade ago to more than 50 percent of New York Stock Exchange trading volume -- as one of the main culprits.
Options markets also are propelling the surge in data traffic. The Options Price Reporting Authority (OPRA) -- which consolidates the price feed from all six options exchanges -- is projecting 110,000 messages per second (MPS) by July of 2005, and that will rise to 130,000 MPS by January 2006 and 149,000 MPS by July of 2006, according to Joseph Corrigan, executive director at OPRA, who also spoke at the event.
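Taken together, OPRA's projections imply a fairly steady semiannual growth rate, which can be backed out with simple arithmetic. The calculation below is a back-of-envelope check on the article's figures, not OPRA's own methodology:

```python
# OPRA's projected peak message rates (messages per second), per the article.
projections = {
    "Jul 2005": 110_000,
    "Jan 2006": 130_000,
    "Jul 2006": 149_000,
}

labels = list(projections)
rates = list(projections.values())

# Implied growth over each six-month step.
for i in range(1, len(rates)):
    growth = rates[i] / rates[i - 1] - 1
    print(f"{labels[i - 1]} -> {labels[i]}: +{growth:.1%}")
# prints:
# Jul 2005 -> Jan 2006: +18.2%
# Jan 2006 -> Jul 2006: +14.6%
```

At roughly 15 to 18 percent growth per half year, the 160,000-MPS capacity target Corrigan cites for 2007 follows naturally from the July 2006 figure.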
To boost capacity for 2007, OPRA is using its budget to pay for bigger boxes at Securities Industry Automation Corp., which processes OPRA's data. "It is going to cost a huge jump in dollars to go up to 160,000 messages per second," says Corrigan, "but exchanges have requested more volume," he notes.
Though it's hard to pin down the main reason for the surge in options data, Corrigan attributes the rise to the two new electronic options exchanges -- the International Securities Exchange (ISE) and the Boston Options Exchange (BOX). "They have forced the floor-based exchanges to become more automated," says Corrigan. He notes that the Chicago Board Options Exchange (CBOE) now has multiple market makers in each option using auto quoting. Since they don't have to be physically located on the floor, overall traffic is increasing.

In the equities world, data volumes are up as a result of a variety of real-time data products demanded by traders, says William O'Brien, senior vice president, market data distribution, at the Nasdaq Stock Market. On April 15, Nasdaq released a new version of TotalView, its premier data offering. The service presents the full depth of book on Nasdaq-listed securities, incorporating the Brut order book as well.
Since January 2004, TotalView's 15-second message peak is up 186 percent, while the average daily message counts are up 190 percent. Since the start of this year, message rates are up 70 percent, says O'Brien. But he says that's only one side of the equation, noting there are no meaningful real-time depth-of-book NYSE products. (Editor's note: The NYSE is currently waiting on SEC approval for real-time OpenBook. The five-second version is still available now, according to a spokesperson.) With the NYSE's proposed acquisition of Archipelago and with Nasdaq buying INET, the consolidation of market centers could cause greater competition and increases in traffic, O'Brien predicts.
Market data experts say these trends are only going to continue. "Program trading and algorithmic trading, that's the competitive edge," says Andrew Goldsmith, director, global head of market data at Dresdner Kleinwort Wasserstein (DrKW). "These exchanges with electronic mechanisms are continuing to grow ... and it's something we need to grasp," he says.
As a result, firms, exchanges and market data aggregators are grappling with escalating capacity and demands for low latency. Because of pressure from electronic-trading desks to cut out latency, a lot of major firms have had to explore direct-data connectivity to exchange feeds, says Esler. "That has redefined real-time," he says.
However, Goldsmith says DrKW does not have any direct-exchange feeds. "I pay quite a bit of money to vendors," he says. "Their service levels require them to get me the data in a justified time frame."
To cope with alarming data volumes, former Bear Stearns exec Esler says firms need to look at overhauling their market data distribution backbones. "The capacity that exchanges and ECNs request of you is on demand. It becomes much more difficult for a market-data manager to manage this capacity. You can do that through provisioning or (adding) circuits," suggests Esler. "If you're not provisioning your extranets, you can get caught out very quickly in a high-volume situation, and it's not a pretty thing when that happens," he warns.
"As far as handling the exchange volumes coming in, I think I pay the software provider of the market-data backbone to pipe that out for me," says Goldsmith, noting that the DrKW built out its trading floor a year and a half ago. Goldsmith recommends that firms leverage some of the managed network providers such as Radianz. "Quite a few are dedicated to measuring service level -- they have mechanisms to measure bandwidth requirements and how much pressure that puts on the networks in place," he says.
To address the humongous options volumes, Connell said ComStock produced a so-called mitigated options feed last year, reducing the size of the feed by dropping inactive options. Though the vendor got a lot of interest up front, Connell said he could count on one hand the number of customers that have taken the feed.
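A mitigated feed of the kind Connell describes can be pictured as an activity filter sitting in front of the raw feed: quotes for contracts with no recent trades are dropped before distribution. The sketch below is a minimal illustration of that idea -- the interface, field names and five-minute activity window are assumptions, not ComStock's actual logic:

```python
class MitigatedFeed:
    """Drops quotes for options with no recent trade activity."""

    def __init__(self, activity_window_secs=300):
        self.activity_window = activity_window_secs
        self.last_trade = {}  # option symbol -> timestamp of last trade

    def on_trade(self, symbol, ts):
        # Any trade marks the contract as active.
        self.last_trade[symbol] = ts

    def on_quote(self, symbol, quote, ts):
        """Return the quote if the option traded recently, else None (dropped)."""
        last = self.last_trade.get(symbol)
        if last is not None and ts - last <= self.activity_window:
            return quote
        return None  # inactive contract: quote is filtered out of the feed

feed = MitigatedFeed(activity_window_secs=300)
feed.on_trade("XYZ 50C", ts=1000.0)
# An active contract's quote passes through; a dormant one is dropped.
assert feed.on_quote("XYZ 50C", {"bid": 1.2, "ask": 1.3}, ts=1100.0) is not None
assert feed.on_quote("XYZ 60C", {"bid": 0.1, "ask": 0.2}, ts=1100.0) is None
```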
Traders, however, are not interested in mitigated feeds. "If the guy next to him or on the other side of the phone isn't trading on the same feed, there's no interest," says DrKW's Goldsmith.
Esler agrees. "There is absolutely no interest in conflating that feed -- it's all about getting every tick," he says. "However, because these market volumes are so overwhelming, it's time to start taking every application that comes along and challenging its need for every tick."
Esler advises market data managers to think about "specific use provisions." For example, if the application consuming the data is for mark-to-market or calculating P&L (profit and loss), the user may not need up-to-the-second quotes, he says. "Someone like a data czar is going to sit in front of the project office and say, 'Is there a less costly method to get you there?'" Esler predicts.
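The "specific use" idea Esler describes amounts to per-application conflation: a P&L job keeps only the latest quote per symbol and reads it at its own cadence, instead of draining every tick. A minimal sketch of that pattern, with an invented interface for illustration:

```python
class ConflatingSubscriber:
    """Keeps only the latest quote per symbol; a slow consumer (e.g. a
    mark-to-market job) polls snapshots instead of processing every tick."""

    def __init__(self):
        self.latest = {}

    def on_quote(self, symbol, price):
        # Overwrite in place: intermediate ticks are intentionally discarded.
        self.latest[symbol] = price

    def snapshot(self):
        # The P&L application reads this at its own cadence, say once a minute.
        return dict(self.latest)

sub = ConflatingSubscriber()
for px in (100.0, 100.5, 99.8):      # three ticks arrive in quick succession...
    sub.on_quote("IBM", px)
assert sub.snapshot() == {"IBM": 99.8}  # ...the consumer sees only the last
```

The trading desk next door still takes the full-tick feed; conflation is applied only where the application's use case tolerates it, which is exactly the challenge Esler says the "data czar" should be issuing.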
Nasdaq's O'Brien says that data mitigation strategies can offer only limited capacity savings. He notes that with the earlier version of TotalView, there was so much traffic coming out of the top five price levels that trimming the book yielded no capacity savings or benefit at all.
Nasdaq has overhauled its technology infrastructure to reduce latency by upwards of 75 percent, says O'Brien. For instance, the electronic stock market is using compression software on the client side that can compress data by two-thirds. In addition, it's distributing feeds from its new TotalView product in binary format as opposed to the older ASCII, which was expanding packet sizes and increasing latency as well, he says. It's also releasing a machine-readable variant of TotalView, which can result in a 44 percent reduction of bandwidth and a 60 percent reduction in latency.
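The bandwidth effect of moving from delimited ASCII to a fixed binary layout, as O'Brien describes for TotalView, can be seen with a toy quote message. The field layout below is invented for illustration -- it is not Nasdaq's wire format:

```python
import struct

# A toy quote update: symbol, timestamp, bid/ask as integer ten-thousandths, sizes.
symbol = b"MSFT"
timestamp = 20050415093000123       # YYYYMMDDHHMMSSmmm as one integer
bid_e4, ask_e4 = 251_250, 251_300   # i.e. 25.1250 / 25.1300
bid_size, ask_size = 900, 1200

# Delimited text encoding: every digit costs a byte.
ascii_msg = b"MSFT,20050415093000123,25.1250,25.1300,900,1200\n"

# Fixed binary layout: 8-byte symbol, 8-byte timestamp, two 4-byte
# prices, two 2-byte sizes (network byte order).
binary_msg = struct.pack("!8sQIIHH", symbol, timestamp,
                         bid_e4, ask_e4, bid_size, ask_size)

print(len(ascii_msg), len(binary_msg))  # prints: 48 28
```

Besides being smaller on the wire, fixed-width binary fields spare the receiver the cost of parsing decimal text on every message, which is where much of the latency reduction O'Brien cites would come from.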
Despite these data management challenges, there's an upside: The skyrocketing market data volumes may have positive implications for industry profits, since the firms for which market data managers work "are contributing to these volumes, which hopefully increases their revenues," says DrKW's Goldsmith.