Over the past few years, a lot of attention has been paid to the idea of a comprehensive reference-data 'utility' to address the securities industry's need for clean data for STP and other initiatives. Conceptually, this amounts to some form of 'neutral' intermediary between the data vendors and their clients to amalgamate, clean, and support the data inputs for trading, compliance, and money-management operations. Such an initiative promises reduced trade-settlement times and mitigated business risk, just to name a few obvious benefits.
It's no wonder this idea has currency, for these reasons and others. Perhaps the question burning brightest in investment-data management today is how to drive down the recurring costs of market and reference data. At the biggest firms, these costs are monumental, as several studies have reported. The cost of reference-data feeds alone tops $3 million at the bigger firms, according to Tower Group. In addition, the function is typically overseen by a staff averaging nearly 60, and far higher at some institutions. Still, 40 percent of all trades fail to settle due to errors that persist despite this huge spend. In the cold light of the current economic day, these costs stand out as a ripe target for reduction.
It's only logical that leading firms wonder if there could be a better way entirely; i.e., a more leveraged approach that would eliminate duplication of effort around the industry in managing this data. In short, how about a utility that could provide a clean, normalized data 'spigot' all comers could drink from, with reduced concern for errors, omissions, and other worries?
In response, a wide range of technology vendors, consultants/system integrators, service providers, venture capitalists/private-equity firms, and major data vendors themselves have explored these ideas with an eye toward building this over-arching industry data solution.
The problem: without access to huge amounts of capital, appropriate technology, compliant data vendors, data-management skills across an impossibly broad set of asset classes and, most importantly, a willing set of major customers, all coordinated perfectly, the idea hasn't gotten traction. Synetix (the Reuters/Capco joint venture) and Access International (which promoted this idea in concert with IBM) have been two of the casualties of this phase.
But all is not lost. While it may not be obvious to the casual observer, what I'll call the 'vertical utility,' serving narrower, specific data needs, has made significant inroads. The vertical-utility approach has two advantages for clients and vendor participants: capital and organizational overhead issues are reduced, since the sheer scope of the problem is more manageable; and the need to rewire systems on the client side to accommodate a fundamentally new approach to data acquisition is reduced or eliminated.
Where such approaches have yielded commercial offerings, they have tended to be characterized by focused combinations of data aggregation, technology infrastructure, data-management staffing, and support resources that address a manageable set of data problems with deep, specific expertise.
One example is DTCC's Corporate Action Hub, a real-time messaging solution designed to automate and centralize the point-to-point exchange of corporate-action information that occurs among investment managers, banks and broker-dealers in all markets. The DTCC's service is a classic example of a centralized solution to an industry-wide problem, solved, importantly, by creating a single managed 'hub' for a specific, highly specialized dataset.
Such data alternatives, 'vertical utilities' that provide a time- and cost-efficient means of dealing with a classic data headache, appear to be useful blueprints for the evolution toward a broader utility concept.
The inherent processes that inform these services suggest how the saga of data 'utility-zation' might proceed from its current state. There is no lack of annoying vertical-data-management issues to tackle in the bowels of the investment business, and no lack of alternative ways to tackle them. You know them already: do it yourself (the leading current solution); hire expert outsiders to handle them for you (outsourcing, an up-and-coming concept in the data-management arena); find vertical-utility data-service solutions delivered through central-service bureaus with specific expertise (witness the DTCC solution); or some combination of all of the above.
Ultimately, the smart money says more vertical, mini-utility solutions will appear, and conceptually, could eventually be cross-wired to produce the larger mega-utility that is, or was, the dream. In any event, clients today can buy from an expanding a la carte menu that serves their specific needs in any combination that suits them. That's ultimately what clients want and need, anyway. It's also why a one-size-fits-all approach to supplying the mythical golden master isn't likely to appear in our lifetime.
* * *
Grant Slade is Executive Vice President of Marketing at Iverson Financial Systems, Inc., which has offices in Sunnyvale, California and New York City. Iverson has been supplying research-grade securities-market data and related services to leading institutional-investment firms since 1983.