All financial services firms need accurate and timely risk analysis, but no amount of investment in risk systems will matter if the data isn't clean. Data governance and a solid data supply chain are, in fact, the first steps to making sure analysis is accurate and timely.
Sound data governance includes knowing who is responsible for each data set, setting data-quality standards, and standardizing the processes for submitting data. Mass adoption within a firm makes transparency easy and allows for a quick response to regulatory scrutiny.
"Enterprise data management matters not just to big banks, which were the early adopters, but also to hedge funds and asset managers. We're seeing mass adoption from them; they're coming to us in droves," says Stephen Engdahl, senior VP of product strategy at GoldenSource, an independent provider of a platform that aggregates and standardizes data from internal databases and external vendors for reporting and monitoring purposes. As new financial services firms jump on board the enterprise data management trend, the industry has worked to tweak its offerings to meet particular challenges. "Smaller firms never before recognized they needed to get serious about data management, so that's changing the shape of how data management is offered ... The willingness to go to a service provider or outsider is much stronger than it was two or three years ago. It drove us to launch an on-demand platform that allows a greater number of midsized and smaller firms to adopt these practices. They need solid data management platforms too."
In addition to big banks and small asset firms, stock exchanges have started to pursue data management technology as much for operational efficiency as for profit. "We are seeing exchanges and depositories and central players in the market exploring concepts to make the most of the data they generate to grow additional revenue," explains Engdahl. "It's happening globally and quickly; there's much more activity now than even in April."
Exchanges have found that they can use GoldenSource's platform as a repository of information on listings, corporate announcements, corporate actions, mergers and other factors that affect listing. They make that information available in premium real-time data feeds to market participants while eliminating the manual data entry once required to key in all the attributes. In one case, an exchange that tied its new platform to its public web portal was able to increase its average from 20 attributes per event to 200.
"From the perspective of an exchange, with transaction volume going down, they see transaction revenue declining, so this is a natural output of operations. Many already offer real-time data, and they've done well on the real-time side, but they're seeing the untapped market for reference information around tickers," says Engdahl. "Reduced manual work, and the growing revenues from this service, reflect the fact that there is demand for this information, and exchanges can charge a premium for greater depth and speed of information delivery."
As for market participants, 200 attributes per event may be hard to digest, but for paying customers, customization is always an option. Engdahl explains that customers can select how to receive the information and cut the time spent mapping and transforming data for their own applications, which greatly reduces effort on the participant side. "They have other options for outsourcing information, but if a central trusted exchange is doing the cleansing and data operations around sourcing this data, that's efficiency," he says.
Data Management is a Business Problem, Not a Technology Problem
When new types of customers are onboarding to a new platform to meet regulatory scrutiny and save money, it becomes a business decision as much as, if not more than, a technology decision. Engdahl explains that it's the business executives who are taking a hard look at which functions within the organization contribute to their strategic advantage. "If they can find a more strategic and economic way, they'll choose it." However, this cleansing and outsourcing of functions can have big consequences for business managers not practiced in the art of communicating their particular data wants and needs to the technology side.
"In larger firms, big banks, there's more risk of a Tower of Babel, where departments refer to things in their own way," says Engdahl. In response to the frustrations and confusion on both the business and technology sides, GoldenSource, along with the EDM Council (a not-for-profit business forum for financial institutions), has produced a sort of glossary, called FIBO, that maps acceptable business terms to specific technology functions. "FIBO is really useful in large firms. Smaller firms typically don't have a deep technology function and need to do everything in business terms to begin with, so FIBO mapping makes communication with outsourcers easy."
So far, the glossary has been a success. "We've seen FIBO interest across the board. We had a greater level of inquiry than for any other announcement, so that is pretty universal." He adds, "When we did the FIBO mapping, it was immediately adopted as a way to onboard other departments. Before, there was always a lot of back and forth between departments discussing data migration. Now they are able to point to terms and check off the ones they want, because in business terms it's easy to specify data requirements. It's presented in a way where we can say, 'this is exactly what we need published,' leading to faster roll-out of data management platforms."
Becca Lipman is Senior Editor for Wall Street & Technology. She writes in-depth news articles with a focus on big data and compliance in the capital markets. She regularly meets with information technology leaders and innovators and writes about cloud computing, datacenters, ...