Data," says John Parker, executive vice president and CIO of A.G. Edwards & Sons, "is the center of our universe." Whether it reflects some changing event or static details that denote clients, counterparties and securities, he explains, timely and accurate data is critical to better investment decisions, greater automation of trade processes and more certainty when it comes to regulatory and customer reporting.
Not surprisingly, market participants across the industry - from brokers to custodians to fund managers - are pushing for the highest quality data possible, and many are undertaking technology projects in an effort to rationalize their data environments and make larger amounts of high-quality data available to users more quickly.
Regulations Raise the Bar
As Tom Jordan, CEO of New York consulting firm Jordan & Jordan, observes, the specific information required may differ depending on whether a firm is a custodian or a securities firm, but the importance of data is broadly recognized across all segments of the securities industry. Indeed, it is an issue that has been gaining attention, driven primarily by the industry's emphasis on risk management, Jordan relates. "It is really along the line of understanding what securities you have within your portfolio, whether you are on the buy or sell side," he says.
Risk management, Jordan notes, has a regulatory aspect to it and is no longer a discretionary activity. "From a regulatory point of view, if you are misrepresenting your positions or, just as important, if you are misrepresenting to your clients what the value of your securities are, that really creates more motivation for people to do something" to regulate reporting, he says.
According to A.G. Edwards' Parker, regulations such as Sarbanes-Oxley, Know Your Customer rules and anti-money-laundering initiatives are driving market participants to seek data-related solutions. The choice, he notes, is whether to adopt a tactical approach to compliance or a more strategic approach to meeting requirements.
"If you have reasonably good data, there are ways you could [comply] without the big data initiatives, so you would tackle it more as a [tactical] regulatory initiative," says Parker. "But it is much more efficient to do it by cleaning your data at the same time, and that tends to be the way we are looking at it," he says of A.G. Edwards' strategic approach to data management.
Having addressed problems and opportunities in a tactical manner historically, A.G. Edwards found itself with an array of systems silos, each with its own data support. As a result, trying to integrate them was a challenge, relates Parker, because the databases that supported those systems were separate and set up differently. "Cleaning all that up was and is a huge opportunity to improve not only our efficiency and cost position, but also the decision making and productivity of our workforce," he says.
According to Parker, the starting point has been operational data - the data the firm uses on a daily basis for core activities, such as trade processing, account management and customer relationship management. Information has been divided into subject matter "buckets" that are filled with the requisite data; at the same time, operational data is extracted and put into an analytical data warehouse, Parker explains. "So we have an operational data store and analytical data store," he says.
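In rough terms, the split Parker describes, dividing operational data into subject-matter buckets and extracting a snapshot into a separate analytical store, might be sketched as follows (all record and field names here are hypothetical illustrations, not A.G. Edwards' actual schema):

```python
from collections import defaultdict

# Hypothetical operational records; fields are illustrative only.
operational_records = [
    {"subject": "trade", "id": "T1", "symbol": "XYZ", "qty": 100},
    {"subject": "account", "id": "A1", "owner": "Client A"},
    {"subject": "trade", "id": "T2", "symbol": "ABC", "qty": 50},
]

def bucket_by_subject(records):
    """Divide operational data into subject-matter 'buckets'."""
    buckets = defaultdict(list)
    for rec in records:
        buckets[rec["subject"]].append(rec)
    return dict(buckets)

def extract_to_warehouse(buckets):
    """Copy a snapshot of each bucket into an analytical store,
    leaving the operational store untouched."""
    return {subject: [dict(rec) for rec in recs]
            for subject, recs in buckets.items()}

buckets = bucket_by_subject(operational_records)
warehouse = extract_to_warehouse(buckets)
```

The point of the copy step is that analytical queries run against the extracted snapshot, so they never contend with, or mutate, the operational store.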
The data warehouse is being rolled out first to the firm's marketing people, who mine the data - examining client holdings, for example - to identify sales opportunities that they then package and work on with the branches, relates Parker. Eventually, the capability will be extended directly to the firm's branch managers and financial consultants, he adds. The heavy lifting should be completed by the end of 2006, with three or four deliverables between now and then, he says.
"We will certainly see cost improvements from the technology simplification, and I believe we will see cost improvement from the workflow and productivity tools we can put on top of it," says Parker. "The bigger part, though, is the benefit to the clients, because our financial consultants will have better data with which to work and also will have a better understanding of the frequency in which they are talking to clients."
A Rational Master
State Street Global Advisors (SSgA), meanwhile, has been working to rationalize its separate security reference systems into a unified Market Data Framework. "When we had a number of different security reference systems, we had three different groups running those, but now we are going to fall under one heading and you have a lot of efficiencies," says John White, global manager, investment management data services, SSgA. "And not only is it the efficiencies of processing the data, but of creating a level of transparency of content throughout the organization. So what the people on the research, portfolio management and client reporting sides see is all coming from the same source, which creates a level of consistency across the entire investment cycle within SSgA."
SSgA has adopted a hybrid approach to its technology efforts, developing parts of the system in-house and outsourcing the ETL (extract, transform and load) tools. SSgA uses a service provider to aggregate the disparate sources of pricing and security reference data into a universal file format. "From there, we load it into our security reference system," explains White.
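The aggregation step White describes, translating each vendor's raw field names into one universal format before loading, can be sketched in a few lines (the vendor names and field mappings below are invented for illustration, not SSgA's actual schema or providers):

```python
# Hypothetical vendor feeds with differing field names.
VENDOR_FIELD_MAPS = {
    "vendor_a": {"Ticker": "symbol", "LastPx": "price", "Curr": "currency"},
    "vendor_b": {"sym": "symbol", "close_price": "price", "ccy": "currency"},
}

def to_universal(vendor, record):
    """Translate one raw vendor record into the universal format."""
    return {common: record[raw]
            for raw, common in VENDOR_FIELD_MAPS[vendor].items()}

def aggregate(feeds):
    """Aggregate disparate per-vendor feeds into one normalized list,
    ready to load into a security reference system."""
    rows = []
    for vendor, records in feeds.items():
        for rec in records:
            row = to_universal(vendor, rec)
            row["source"] = vendor  # keep provenance for downstream checks
            rows.append(row)
    return rows

rows = aggregate({
    "vendor_a": [{"Ticker": "XYZ", "LastPx": 10.50, "Curr": "USD"}],
    "vendor_b": [{"sym": "XYZ", "close_price": 10.52, "ccy": "USD"}],
})
```

Keeping the source vendor on every normalized row is what makes later cross-vendor comparison and exception reporting possible.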
The project is ongoing, with a phased implementation timetable. "We look at our entire market data framework as constantly evolving - there are newer types of securities and newer groups always adding to it - so it is not necessarily something that will ever be fully completed," White says. "If anything, it will just get more groups attaching to it, or attaching to a system that is fed by this security reference system."
State Street's Investment Servicing division is undertaking a similar initiative, albeit on a larger scale, given the requirements of the business, according to Peter Cherecwich, senior vice president and head of product and technology solutions for State Street Corp. "We have built something we call Enhanced Security Master, which takes in feeds from multiple vendors and allows you to compare the different vendors and do a quality check," he explains.
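A vendor-comparison quality check of the kind Cherecwich describes might, in minimal form, line up the same security across feeds and flag fields that disagree (the field names and the price tolerance here are assumptions for illustration, not State Street's actual rules):

```python
def quality_check(vendor_views, tolerance=0.005):
    """Compare one security across vendor feeds and flag disagreeing fields.

    vendor_views maps vendor name -> {field: value} for the same security.
    Returns a list of (field, {vendor: value}) exceptions.
    """
    exceptions = []
    fields = set().union(*(view.keys() for view in vendor_views.values()))
    for field in sorted(fields):
        values = {vendor: view.get(field) for vendor, view in vendor_views.items()}
        distinct = set(values.values())
        if len(distinct) > 1:
            nums = [v for v in distinct if isinstance(v, (int, float))]
            if len(nums) == len(distinct):
                # Numeric fields tolerate small cross-vendor pricing noise.
                lo, hi = min(nums), max(nums)
                if lo and (hi - lo) / abs(lo) <= tolerance:
                    continue
            exceptions.append((field, values))
    return exceptions

exceptions = quality_check({
    "vendor_a": {"price": 100.00, "coupon": 5.0},
    "vendor_b": {"price": 103.00, "coupon": 5.0},
})
```

Exceptions would then feed a manual review queue rather than blocking the load outright, which is one common design choice for this kind of check.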
The firm has put the equities portion of the project in place and is now moving on to the derivatives side. "We are probably two years into implementation now, and another year to 18 months away from completion," Cherecwich says.
The division also is looking at implementing a service-oriented architecture (SOA), adds Cherecwich. "We're so diverse - in so many countries, with so many different applications - that to be able to have an XML layer that shows the data that all the different systems can look at is a big benefit," he says. "However, we're still some way away from that on a fully implemented basis."
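The XML layer Cherecwich mentions could, in its simplest form, render each system's security record as a common document that any downstream application can parse, regardless of which source system produced it (the element names below are invented for illustration):

```python
import xml.etree.ElementTree as ET

def security_to_xml(security):
    """Render one system's security record as a common XML document."""
    root = ET.Element("security", id=security["id"])
    for field in ("name", "assetClass", "currency"):
        child = ET.SubElement(root, field)
        child.text = str(security[field])
    return ET.tostring(root, encoding="unicode")

xml_doc = security_to_xml({
    "id": "US0000000001",   # dummy identifier, for illustration only
    "name": "Sample Corp",
    "assetClass": "equity",
    "currency": "USD",
})
```

Because consumers depend only on the shared element names, each legacy system can keep its internal representation while still exposing data in the common layer.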
Despite technology advances in data collection, scrubbing and downstream distribution, such as SOA, most firms still are some way away from achieving a clean, automated data environment. To begin with, there is just so much data being pumped out, and the appetite for information, increasingly presented in real time, continues to grow. As Cherecwich comments, "How do you get your hands around it and organize it in a way that you can make decisions and present that data to all the users in an easy-to-deal-with fashion?"
According to SSgA's White, the biggest challenge is cross-referencing - "using disparate vendors [and] being sure that you are comparing apples to apples." And, with the introduction of more and more esoteric securities, simply obtaining the necessary information can be difficult.
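The cross-referencing problem White describes is, at bottom, symbology mapping: every vendor identifier must resolve to a single internal security ID before two records can be compared apples to apples. A minimal sketch (the identifiers and mapping table are made up; a production system would source them from a symbology service):

```python
# Hypothetical identifier cross-reference table; values are dummies.
XREF = {
    ("CUSIP", "123456789"): "SEC-001",
    ("ISIN", "US1234567890"): "SEC-001",
    ("SEDOL", "B123456"): "SEC-001",
}

def internal_id(id_type, value):
    """Resolve a vendor identifier to the internal security ID, if known."""
    return XREF.get((id_type, value))

def same_security(a, b):
    """True when two (id_type, value) pairs refer to the same security."""
    ia, ib = internal_id(*a), internal_id(*b)
    return ia is not None and ia == ib
```

Unmapped identifiers resolve to nothing rather than to a guess, which is the safe failure mode: an unrecognized security surfaces as an exception instead of being silently compared against the wrong instrument.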
And even when you build a common data infrastructure, says Jordan & Jordan's Jordan, "You have to do some marketing on why it makes sense from a corporate point of view" to get buy-in. If nothing else, the threat of the big regulatory stick should offer some leverage.