When it comes to devoting resources to data management, most securities firms opt for the status quo, which means allocating as little as possible to the function. After all, many on Wall Street assume that data management doesn't provide a competitive advantage.
"When we started looking at data in securities firms, it became apparent that in the average firm it's a mess," relates Matthew Nelson, a senior analyst with Needham, Mass.-based TowerGroup. According to Nelson, a typical firm has 10 to 30 different, somewhat redundant databases with discrepancies and errors, as well as 30 or more applications that consume reference data and incoming data feeds from as many as 15 third-party providers. The consequences of such "spaghetti diagrams" of data stores can be dire, he says. Yet most firms continue to do no more than the minimum required to address enterprise data management.
The most critical cost associated with clinging to legacy data management systems and strategies, asserts John White, head of State Street Global Advisors' investment management data services group, is unnecessary risk. Not only do Sarbanes-Oxley and anti-money-laundering rules require investment firms to scrutinize their data management processes closely, but the sheer volume of trading and the growth in assets under management at securities firms also magnify the financial risk accrued by companies that don't make data management a priority, he points out.
"Whether physical or logical, there must be a designated version of the truth that all downstream processes read from," says John Bottega, chief data officer at Citigroup. "Without this consistency, processes break, exposing firms to increased operational, financial and reputational risk."
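Bottega's "designated version of the truth" is commonly implemented as a golden-copy store: one authoritative record per security that every downstream process reads, so views cannot drift apart. A minimal sketch of the idea, with all names and identifiers purely illustrative rather than drawn from any firm's actual system:

```python
# Sketch of a "golden copy" reference data store: a single designated
# record per security that all downstream processes read from, instead of
# each system maintaining its own (potentially divergent) copy.

class GoldenCopy:
    """Single designated version of the truth for reference data."""

    def __init__(self):
        self._records = {}  # security_id -> attribute dict

    def publish(self, security_id, record):
        # Only the data management function writes; consumers never do.
        self._records[security_id] = dict(record)

    def read(self, security_id):
        # Every downstream process (trading, settlement, risk) reads
        # the same record, so their views cannot get out of sync.
        return dict(self._records[security_id])

golden = GoldenCopy()
golden.publish("US0378331005", {"ticker": "AAPL", "currency": "USD"})

trading_view = golden.read("US0378331005")
settlement_view = golden.read("US0378331005")
assert trading_view == settlement_view  # consistent by construction
```

Whether the golden copy is a physical database or a logical view over federated stores, the key property is the same: consumers read, they never write.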
Dated, reactive data management processes can leave a company unaware of its vulnerabilities to other firms' misfortunes -- when Enron failed, for example, it took some firms weeks to find all their points of exposure because of the fragmented, disjointed nature of their data, according to Gartner analyst Mary Knox. "You cannot adequately manage risk unless you have the right data and the data quality to understand what the risks are, to analyze and respond to them," she says.
Another cost of not upgrading data management capabilities is failed trades, as inadequate data management processes mean firms are certifying data after the fact. According to TowerGroup's most recent Reference Data Survey, inaccurate or inconsistent reference data and poor data management processes accounted for nearly 60 percent of all failed trades in 2005. About 60 percent of any given trade record is composed of reference data, TowerGroup says. If that reference data -- security master data, account numbers, customer information, counterparty data, etc. -- is stored inconsistently in disparate CRM, accounting and order management systems, as well as in incoming market data streams, trade breaks are certain to occur.
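The failure mode TowerGroup describes is detectable before settlement: the same reference data held redundantly in CRM, accounting and order management systems can be reconciled to surface discrepancies before they break trades. A hypothetical sketch (not any vendor's actual tool, and with invented system names and records):

```python
# Reconcile the same reference data held redundantly across systems
# and flag discrepancies -- the kind of mismatch that causes trade breaks.

def reconcile(systems):
    """Compare each security's record across systems; return mismatches."""
    breaks = []
    all_ids = set()
    for records in systems.values():
        all_ids.update(records)
    for sec_id in sorted(all_ids):
        views = {name: records.get(sec_id)
                 for name, records in systems.items()}
        # More than one distinct view of the same security is a break.
        if len({repr(v) for v in views.values()}) > 1:
            breaks.append((sec_id, views))
    return breaks

systems = {
    "crm":        {"US0378331005": {"currency": "USD"}},
    "order_mgmt": {"US0378331005": {"currency": "USD"}},
    "accounting": {"US0378331005": {"currency": "usd"}},  # inconsistent entry
}

for sec_id, views in reconcile(systems):
    print(sec_id, "mismatch across systems:", views)
```

Running checks like this proactively, rather than "certifying data after the fact," is precisely the shift from reactive to managed reference data that the survey figures argue for.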
Losing to Latency
Even if a firm overcomes poor data management and is able to execute trades, it likely will experience some latency as a result of inadequate processes (or technology). If your trades take 10 seconds each to go through when your competitors' take one second, at best you'll have price discrepancies and unhappy customers, and at worst, you'll lose customers entirely. Consider a foreign exchange desk with traders in New York, London and Tokyo. With a latency of just a few seconds per trade, these traders could actually end up trading against each other because their positions are out of sync.
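The FX-desk scenario above can be made concrete with a toy model: if position updates replicate between desks with a lag, one desk sees a stale book and can unknowingly take the opposite side of its own firm's position. All figures and names here are invented for illustration.

```python
# Toy model of desks whose shared book replicates with a lag.

class Desk:
    def __init__(self, name):
        self.name = name
        self.position = 0.0  # local view of the shared EUR/USD book
        self.pending = []    # trades not yet replicated to other desks

    def trade(self, amount):
        self.position += amount
        self.pending.append(amount)

def replicate(desks):
    """Flush each desk's pending trades to every other desk."""
    for d in desks:
        for amount in d.pending:
            for other in desks:
                if other is not d:
                    other.position += amount
        d.pending.clear()

ny, london = Desk("NY"), Desk("London")
ny.trade(+10e6)  # NY buys 10M EUR/USD

# Before replication, London still sees a flat book -- and may sell,
# trading against its own firm's freshly opened position.
stale_view = london.position   # still 0.0: out of sync
replicate([ny, london])
fresh_view = london.position   # 10e6: consistent once replicated
```

The longer the replication lag, the wider the window in which the two desks are effectively trading against each other.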
One available technology for accelerating the movement of trade data is GemStone's enterprise data fabric, which makes the RAM on many machines in a network look and act like one large cache, enabling millisecond data replication speeds, according to the vendor. Still, many firms are content to continue to do business as usual, having calculated that the risks of slower trades, or failed trades, are acceptable.