Wall Street & Technology is part of the Informa Tech Division of Informa PLC




Market Data Dilemmas

Industry Leaders Focus on Data Centralization to Improve Data Quality

Convening at an industry conference yesterday, financial-services experts said data quality, not cost, is the driving force behind data centralization. Although cost savings are a positive result of data centralization, they should not be the driver, says David Hirschfeld, managing director of Enterprise Data Services at Citadel Investment Group, who sat on a panel called "The Ultimate Goal: Data Centralization."

"If we sell this thing as a cost initiative, we're probably going to fail. We ought to be honest and tell people that this is about data quality," Hirschfeld says. "If we keep people's expectations properly focused on that, we'll have a much higher probability of success." Speakers throughout the day touted the importance of data quality, which helps reduce risk, costs and trade failures. In addition, data centralization will help firms run more productive and efficient businesses.

Hirschfeld explains that because data-management investments take three years on average to show a return, the most persuasive argument for winning senior management support is the quest for accurate data, especially in light of industry risk-mitigation and audit considerations.

The panelists pointed out that a centralized data environment will improve productivity, as data will flow more quickly throughout the firm, allowing traders and operations personnel to react to it faster.

However, the challenges of centralizing data are daunting, says panelist Rich Robinson, assistant vice president of Global Equities IT at Deutsche Bank AG, because of the multitude of sources contributing data to a firm, many with different delivery formats. In addition, he explains, there is a lack of standard definitions, not only among different parts of one firm but also when communicating with outside organizations.

A third panelist, Stephen Gouthro, senior vice president of Investment Data Management & Investment at Putnam Investments, says that the ideal approach to data processing is based on a model that filters data directly from its sources into a centralized warehouse, from which it is then distributed to other legacy systems, including trading, accounting, research and performance modules. He explains that the reverse process, feeding data into legacy systems first and then into a centralized warehouse, creates a never-ending cycle of data reconciliation.
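The warehouse-first flow Gouthro describes can be sketched in a few lines. The sketch below is purely illustrative and not from the article: all names (`normalize`, `SecurityRecord`, the vendor field names) are hypothetical, and the "warehouse" is just a dictionary standing in for a real data store. The point it demonstrates is the ordering: cleanse every feed into one canonical copy first, then push that same copy to every downstream system, so there is nothing left to reconcile.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class SecurityRecord:
    """One cleansed, canonical record in the central warehouse (hypothetical schema)."""
    ticker: str
    price: float
    source: str


def normalize(raw: dict, source: str) -> SecurityRecord:
    # Each vendor delivers a different format; map every feed to one
    # canonical schema *before* anything downstream sees the data.
    return SecurityRecord(
        ticker=raw.get("ticker", raw.get("symbol", "")).upper(),
        price=float(raw.get("price", raw.get("last", 0.0))),
        source=source,
    )


def load_warehouse(feeds: dict) -> dict:
    """Filter all sources into the central warehouse: one golden copy
    per ticker (later feeds overwrite earlier ones)."""
    warehouse = {}
    for source, records in feeds.items():
        for raw in records:
            rec = normalize(raw, source)
            warehouse[rec.ticker] = rec
    return warehouse


def distribute(warehouse: dict, consumers: list) -> None:
    """Push the same cleansed data to every downstream legacy system
    (trading, accounting, research, performance)."""
    for consumer in consumers:
        consumer.update(warehouse)


# Usage: two vendors with different field names, two downstream systems.
feeds = {
    "vendor_a": [{"ticker": "ibm", "price": "101.5"}],
    "vendor_b": [{"symbol": "MSFT", "last": 27.3}],
}
warehouse = load_warehouse(feeds)
trading, accounting = {}, {}
distribute(warehouse, [trading, accounting])
# Both systems now hold identical records -- unlike the reverse flow
# (legacy systems first, warehouse second), which forces continual reconciliation.
```

In the reverse flow each legacy system would normalize the feeds independently, and any difference in their cleansing rules would have to be reconciled after the fact, which is the never-ending process Gouthro warns against.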

While there is no easy way to solve the data-centralization quandary, Gouthro advises examining the problem from a business angle. "There is no single solution," he says. "You really need to look at your infrastructure within your business organization in order to find what your technical solution should be. It's not the other way around."

The conference was held yesterday at The Roosevelt Hotel in New York City.
