Trading Technology

Cristina McEachern
As STP Pressure Grows, Financial-Services Firms Tackle Data Management

In preparation for T+1, market-data vendors are touting STP-specific products and services.

In preparation for T+1, many data vendors are touting STP-specific products and services, as some firms look to buy rather than build a data-management solution. Whether it's market data, reference data, historical data or any other information needed to complete the trade process, financial firms are grappling with the best way to automate data handling and ensure trades are settled in a timely fashion. But with that wide-ranging data stored in silos across the firm, often in varying formats, the task of integrating it and automating across many different areas has become a complicated one.

One major question firms are beginning to face is whether to improve data management internally or turn to a third party for products and services. As more firms reach this fork in the road, more vendors are touting their options and looking to help financial-services firms manage data and move to straight-through processing.

The Data-Management Process
The data-management process centers on the acquisition, normalization and scrubbing of data: in other words, getting the data, getting it into the right format and making sure it is correct, with no gaps or errors. Why is this such a daunting task? Because to ensure complete global data coverage, financial-services firms must subscribe to multiple, often redundant data services. The firm must then compare the data from each of these sources to normalize the information, or make sure it matches up, which is a mostly manual, time-consuming process.
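To make that cross-source comparison concrete, here is a minimal sketch in Python, with hypothetical feed contents and field names: the same security record is pulled from two vendor feeds, and any field that disagrees is flagged for manual scrubbing.

    # Compare the same security across two (hypothetical) vendor feeds
    # and flag disagreements for a data analyst to resolve.
    feed_a = {"US0378331005": {"coupon": 0.0, "maturity": None}}
    feed_b = {"US0378331005": {"coupon": 0.5, "maturity": None}}

    def find_breaks(a: dict, b: dict) -> list:
        breaks = []
        for isin in a.keys() & b.keys():          # securities both feeds carry
            for field in a[isin]:
                if a[isin][field] != b[isin].get(field):
                    breaks.append((isin, field))  # mismatch: needs review
        return breaks

    print(find_breaks(feed_a, feed_b))  # [('US0378331005', 'coupon')]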

The first step to improving this process and moving toward automation is to identify where data is stored by taking an inventory of the firm's various data models, says Tim Lind, senior analyst in the Investment Management Practice at TowerGroup. Some questions to ask are: Where is the data stored within the organization? What is consistent across those data models and what is different? "In other words, do I use a CUSIP to identify securities within this database and an ISIN within this one and then have to translate between the two?" asks Lind.
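Lind's identifier example can be made concrete. For a U.S. security, the ISIN is the country code "US" followed by the nine-character CUSIP and a check digit computed with the Luhn algorithm, so a minimal translation function (a sketch, not any firm's production mapping) looks like this:

    # Translate a US CUSIP into its ISIN: "US" + CUSIP + Luhn check digit
    # computed over the letter-expanded string (A=10 ... Z=35).
    def _luhn_check_digit(payload: str) -> str:
        digits = "".join(str(int(c, 36)) for c in payload)
        total = 0
        for i, d in enumerate(reversed(digits)):
            n = int(d)
            if i % 2 == 0:      # double every other digit, rightmost first
                n *= 2
                if n > 9:
                    n -= 9
            total += n
        return str((10 - total % 10) % 10)

    def cusip_to_isin(cusip: str, country: str = "US") -> str:
        body = country + cusip
        return body + _luhn_check_digit(body)

    assert cusip_to_isin("037833100") == "US0378331005"  # Apple Inc.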

While emerging data standards such as Market Data Definition Language (MDDL) could help firms describe and define data elements more uniformly, and as a result more easily scrub and compare data across different vendor and internal formats, such standards may take time to mature.

"I don't think any vendor necessarily sees their codes as strategic," says Lind. But he adds that the data vendors probably will wait until their customers demand more standardized data before they really make the drive to distribute in standards such as MDDL. "Standards are always slow to implement and develop because legacy applications are so entrenched."

Lind explains that other options, such as outsourcing, are emerging to help firms tackle their data-management problems.

"One of the activities that some are looking to outsource is the acquisition, normalization and scrubbing of data, which is not necessarily something that can be seen as a competitive advantage," says Lind. "They all have to do it, so isn't there a way they can outsource this particular burden?" In general, some of the trends that Lind sees in this area include the possibility of custodian banks offering some data management to their customers.

"For example, when a firm nominates State Street for fund accounting for mutual funds, in fact, they are outsourcing the maintenance and acquisition of reference data, so I think that's an important solution that custodians are going to offer," says Lind.

But he cautions that this option may be best for domestic-equity shops in particular and that the more complex and global the securities are, the more difficult it may be to depend on a third party to create and maintain the data. "Within the enterprise and within a certain buy-side firm, they probably have very specific requirements, say, if they trade a lot of fixed-income securities or mortgage-backed or asset-backed securities, they may not find the quality or depth of information from an outsourced provider," he points out.

Assessing the Technology
A growing number of vendors are also touting specific products and services to address the issue of internal data management in preparation for straight-through processing.

In this case, one key to improving data management is propagating data additions and changes automatically across the enterprise. Lind describes this as "a sort of command-and-control department that can tell every time a record was changed in a certain application and be able to approve that change and then propagate that change to other systems in an automated way." He adds that this can be seen as a central device or hub that all data flows through.

More specifically, Lind compares this type of technology to the sync button that updates information stored in various places such as cell phones, Palm Pilots and Microsoft Outlook.

"The problem isn't necessarily that the information is in three places, the problem is when Outlook is changed it doesn't necessarily update records that are on your cell phone or Palm Pilot," says Lind. He adds that an enterprise-type "sync" button would then allow normalization and communication between different applications. "This sort of interconnectivity of databases is an interesting step forward in efficiency," he says.

Building on the hub-and-spoke idea for data management, Lind says it's also important to centralize the people responsible for data maintenance into a business line of its own.

"An independent department would be better able to get you the right information and get a level of expertise on the descriptive attributes of the data record," says Lind. "Operationally I think the functions are being more consolidated and centralized in the maintenance of data."

_________________________

>> A Vendor Sampling <<

Reuters
Following the release of a survey on reference data and the move to T+1 -- which found that inconsistent or incomplete reference data is a significant cause of trade failure and that the mostly manual process of maintaining reference data costs financial firms an average of $3.2 million per year -- Reuters formed a joint venture with Capco, called Synetix, to provide data-management products and services. (See Wall Street & Technology, December 2001.) The first product to come to fruition under the venture is the Reference Data Manager, a tool that allows users to aggregate reference data from multiple sources. By creating a single view of all reference data, users can find inconsistencies and inaccuracies more efficiently, says Cormac Kelly, chief executive officer of Synetix.

"Reference data is an issue that all players have -- any firm that is capturing and recording transactions for clients or with counter parties has the issue of recording reference data," says Kelly. "And what we've found is that it's normal in the industry that people have the same items of data represented differently." He points to an example of the U.S.-dollar data element, which might be represented as USD, 001, US Dollars or a dollar sign, depending on who is entering the information. The Reference Data Manager product maps the various reference data sources -- whether it is 10 or 100 -- to reference each of the data elements as the same thing in order to bring information together for movement back and forth, regardless of which data item for U.S. dollars is used. "This way the quarks and idiosyncrasies of every system and how they capture, represent or utilize data is captured into the system and doesn't have to be changed or adjusted."

Asset Control
Asset Control is leveraging its platform, which features a "golden copy": a single, verified, standard version of the data, held at a central hub in the data-management process and used to populate all of a firm's databases. The Asset Control platform is connected to a user's data sources with application-programming interfaces, which are either standard from the vendor or can be built quickly for new or internal sources. The data then flows into a central engine, which normalizes and cleanses it, and back out to the appropriate applications or storage areas.
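A rough sketch of how a golden copy might be assembled, assuming hypothetical feed names and a simple precedence rule (illustrative only, not Asset Control's actual design): records for the same security arrive from several sources, and the highest-precedence source that has a value for a field wins.

    FEED_PRECEDENCE = ["vendor_a", "vendor_b", "internal"]

    def golden_copy(records: dict) -> dict:
        """Merge per-feed records for one security into a single record."""
        golden = {}
        fields = {f for rec in records.values() for f in rec}
        for field in fields:
            for feed in FEED_PRECEDENCE:
                value = records.get(feed, {}).get(field)
                if value is not None:
                    golden[field] = value    # first feed with a value wins
                    break
        return golden

    print(golden_copy({
        "vendor_a": {"isin": "US0378331005", "coupon": None},
        "vendor_b": {"isin": "US0378331005", "coupon": 6.25},
    }))    # {'isin': 'US0378331005', 'coupon': 6.25} (key order may vary)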

"Users can manipulate and manage all of the data in Asset Control and prepare their golden copy, which ultimately is distributed based on point-to-point connections or middleware distribution," says Ger Rosenkamp, founder and chief executive officer of Asset Control. There is also an outsourced version of the Asset Control platform; a sort of central data utility with a billing module built on top. While most of the larger firms would implement Asset Control in-house, Rosenkamp says that the outsourced version is addressing the needs of the small- to medium-sized firms that are looking for data management.

Bloomberg
While Bloomberg is best known for its market data and for serving the pre-trade needs of users, the company is also working to help buy-side clients achieve straight-through processing with trade-order-management applications on the Bloomberg platform. Bloomberg offers its own trade-order-management system (TOMS) as well as a proprietary middleware product (Gateway) that can connect portfolio-accounting systems to the Bloomberg platform. "Bloomberg data and other data can flow out of the TOMS into other third-party applications and then data can come back," says Charles Garcia, manager of straight-through processing at Bloomberg.

Bloomberg has also signed on as a concentrator, or transport mechanism, into the GSTPA's Transaction Flow Manager. "When a trade is put into the Bloomberg platform, it will be tagged by the user which destination or which virtual-matching utility they require," explains Garcia. "As a concentrator we then take the terms of that trade, put it into the TFM, then the broker/dealer, the money manager and the custodian look at the terms, execute the order and then send it back to the Bloomberg platform for verification." Bloomberg will also be looking to connect with Omgeo and any other utilities that will facilitate straight-through processing for its users. As for the sell side, Bloomberg works with customers to input third-party and internal data onto the Bloomberg platform in order to centralize the information and the processing in preparation for T+1.
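The concentrator flow Garcia describes amounts to tagging each trade with the matching utility the user requires and routing the trade terms accordingly. The sketch below is purely illustrative; the class and destination names are hypothetical, not Bloomberg's interfaces.

    from dataclasses import dataclass

    @dataclass
    class Trade:
        symbol: str
        quantity: int
        price: float
        matching_utility: str   # tagged by the user, e.g. "GSTPA-TFM" or "Omgeo"

    def route(trade: Trade) -> str:
        """Send the trade's terms to the utility it was tagged for."""
        if trade.matching_utility == "GSTPA-TFM":
            return f"{trade.symbol} terms submitted to the TFM"
        if trade.matching_utility == "Omgeo":
            return f"{trade.symbol} terms submitted to Omgeo"
        raise ValueError(f"unknown matching utility: {trade.matching_utility}")

    print(route(Trade("IBM", 1000, 85.50, "GSTPA-TFM")))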
