Historically, enterprise data governance has been treated as the business equivalent of organizing the family garage. Everyone means to do it, but it usually loses out to more pressing tasks, with the rationalization that current data management processes and controls have been good enough so far.
In the aftermath of the 2008 financial crisis, however, regulators of all stripes are disabusing organizations of the notion that their legacy data governance policies and controls will meet the new reporting requirements.
Enterprise data governance is no longer hiding in the shadows and playing a supporting role, according to Mike Atkin, managing director at the Enterprise Data Management (EDM) Council. "Risk has risen to the top of the agenda and now is balanced against the business agenda of making money."
A prime example of regulators sharpening their focus on data quality is the Basel Committee on Banking Supervision's BCBS 239 standard, which was published in January 2013 and contains 14 principles addressing risk data aggregation, calculation, and reporting practices.
"There is a lot in there about having a strong governance structure and framework, as well as board-level participation in data governance," says Jennifer Ippoliti, chief data officer at Raymond James.
Most regulators plan to aggregate all new and existing regulatory filings to better understand market structure and how it operates. However, this works only if firms submit the cleanest data possible. "When you aggregate data is when you start seeing the chinks in the armor in terms of data quality," Ippoliti says.
Improving data governance is no small task, according to Atkin. It requires organizations to unravel data from their business processes; align it to the most important data attributes; trace those attributes back to authoritative sources; manage the quality criteria and business rules; validate the processes; separate the data from compounding and derivation processes; harmonize its meaning using standards and common data dictionaries; make the proper transformations; and deliver it to the client's application. "Everyone is doing this now, and it's a bear," he says.
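One of those steps, managing quality criteria and business rules against a common data dictionary, can be sketched in a few lines of code. The fields, rules, and dictionary below are purely illustrative assumptions, not drawn from any firm's actual data model:

```python
# Hypothetical sketch: critical fields mapped to quality rules from a
# common data dictionary, checked per record before delivery downstream.

from datetime import date

# Illustrative data dictionary: critical field -> validation rule
DATA_DICTIONARY = {
    "trade_id":   lambda v: isinstance(v, str) and len(v) > 0,
    "notional":   lambda v: isinstance(v, (int, float)) and v > 0,
    "currency":   lambda v: v in {"USD", "EUR", "GBP", "JPY"},
    "trade_date": lambda v: isinstance(v, date) and v <= date.today(),
}

def validate_record(record):
    """Return a list of data-quality violations for one record."""
    violations = []
    for field, rule in DATA_DICTIONARY.items():
        if field not in record:
            violations.append(f"{field}: missing critical field")
        elif not rule(record[field]):
            violations.append(f"{field}: failed quality rule "
                              f"(value={record[field]!r})")
    return violations

if __name__ == "__main__":
    good = {"trade_id": "T-1001", "notional": 5_000_000.0,
            "currency": "USD", "trade_date": date(2014, 3, 3)}
    bad = {"trade_id": "", "notional": -10, "currency": "XXX"}
    print(validate_record(good))  # -> []
    print(validate_record(bad))   # four violations
```

In practice these rules would live in governed metadata, not in code, so that business owners, not developers, define what "clean" means for each critical field.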
According to the experts, organizations should not expect that implementing or improving their data governance plans will be a one-time, finite project.
Atkin says firms can document their data flow correctly and maintain it against their inventory of activity. "This can be done in a relatively short time if you know what you are doing and have the resources to do it. Doing it in the light of acquisitions or integration of various business silos into an enterprise view is a lifetime's work."
Enter the CDO
However, don't expect to rely on individual business units to maintain data quality and consistency across the enterprise. Without enterprise-defined policies and controls, data quality often suffers.
To solve this problem, many financial firms have established a chief data officer position, a role that has been popping up across the industry like mushrooms after a forest fire. In the past 12 months, Raymond James and Northern Trust have named their first CDOs and established the corresponding organizations to address these issues.
Kay Vicino, a data veteran and CDO at Northern Trust, says the firm had strong controls around its business processes before creating its CDO office, but no one thought about identifying critical fields, data ownership, data quality, or data lineage as part of the business process. "Think of the chief data office introducing a new way of thinking about these added concepts, as well as introducing new controls and models that look at data as data and not as part of a business process."
Rob Daly is a freelance journalist who has spent more than 18 years covering the IT industry and more than a decade covering the wholesale capital markets. He has delivered news and insights via print, electronic, and streaming media.