Few areas of the IT world have seen the pace of technological advance and innovation that the data management space has. Yet despite these advances across the entire data management lifecycle, businesses still perceive the data management technology shop as slow to respond to evolving business needs. A variety of factors are responsible, most importantly the lack of an organizational perspective.
Not long ago, the very mention of a data management project conjured up visions of multi-year, multi-phase programs driven by technology teams. It always started with a grand vision of a multi-year data transformation blueprint, a build-up of large enterprise data warehouses, and the creation of multiple data marts to support multiple user groups. Decision-making was focused on selecting the right database platform, employing the right ETL tools and BI tools, and adopting the proper phasing strategy that was palatable to business stakeholders.
Today, technological advances including engineered systems, big-data technologies, and advanced visualization (to name a few) have forced a change in this mindset, at least in pockets of influence. Some IT leaders and enterprise architects with an early-adopter mindset have been willing to onboard engineered systems, experiment with big-data technologies, and drive business-facing pilots in the advanced BI space without adopting a slow, multi-year strategy that was often doomed as soon as it started.
Get a good plumber
However, except for a few leaders in this space, most financial services organizations (even Tier 1 shops) are still bogged down by the conflict between getting the plumbing right (data management infrastructure) and getting the right faucets (delivering data insights that help solve business problems). This is not an easy problem to solve.
Take financial services risk management, for example. If you adopt the traditional big-picture enterprise blueprint approach, you need the right reference data, a golden copy of transactional data, and an effective enterprise-level data governance model to get the right data for risk management. This is easier said than done, and would require multi-year projects, especially at large Tier 1 shops.
However, recent regulatory pressures often won't allow a slow response to data management problems. Take the supplemental Basel Pillar 2 risk data aggregation and reporting guidelines for G-SIBs (Global Systemically Important Banks), for example. These guidelines force G-SIBs to find a quick-win approach focused on data management within the risk domain, i.e., to adopt a more nimble approach. You can extrapolate this problem to multiple business problem areas, from customer segmentation and profitability management to fraud detection. How do we solve this?
If you break the problem into two distinct areas, data infrastructure (plumbing) versus business insights (faucets), you get a better perspective. You do need an enterprise-level data management steward to own and drive ongoing initiatives on the infrastructure side. But this has to be balanced by LOB-driven initiatives on the insights side, which are quicker from a time-to-market perspective. Pairing an enterprise-level data management organization with LOB-driven data insight teams gets you partway there, but it does not resolve the underlying conflict. That requires a strategic approach to this area, driven right at the board level. You cannot solve risk management as a problem until you marry risk management as a concept to the very way businesses are run, measured, and rewarded.
In the same way, you can only solve the data management problem by approaching it top-down, with full executive management attention. If you adopt this logic, it is easy to see the need for an enterprise-level data insights governing body with stakeholders from business, operations, and IT. This cross-functional executive team can identify business-facing initiatives that leverage data insights (benefiting sales, marketing, risk management, and finance, to name a few), vet ROI benefits, and drive key projects on the "faucets" side. A "run-the-bank" data infrastructure team would continue ongoing projects to streamline and optimize the plumbing, while any large "change-the-bank" initiative would be driven by direction from the cross-functional executive team. Some thought leaders in the banking space have already adopted this approach by forming cross-functional Analytical Center of Excellence (COE) executive teams to set direction.
Net-net, this brings us back to the fundamental issue I broached in my previous post: Is It Time for an IT Reorg? There is a distinct need for the organizational structure to reflect and align with directions set by emerging technology trends. This is a case of end-users driving back-end "plumbing" decisions, much as BYOD is dictating mobile enablement of technology assets in the user experience space.
The views expressed in this article are my own, and do not necessarily reflect the views of Oracle.