The past six years have seen dramatic changes in central bank policymaking with the advent of macroprudential regulation. These new regulations require central banks and statistical reporting agencies to gather and analyze micro, company-level data.
Financial accounts and micro data
Financial accounts and statements about financial institutions within a given economy remain key data sources within central bank warehouses. This time-series data provides insights into sectors of the economy, and much of it spans multiple years and institutions. However, it often lacks the company-level, micro-oriented inputs needed to identify systemic risk arising from the failure of firms in key sectors.
Macroprudential policymakers analyze micro-level financial ratios and company fundamentals in order to identify domestic and cross-border threats to financial stability. Therefore, one solution to the challenges posed by macroprudential regulatory demands is to manage more granular, company-level data with the ability to aggregate it up to the sector level. These sector-level aggregates complement the macroeconomic data relied upon by policymakers.
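The roll-up from company fundamentals to sector aggregates can be sketched in a few lines. The following is a minimal illustration using pandas; the company names, sectors, and balance-sheet figures are hypothetical, not real data, and the leverage ratio is just one example of a micro-level fundamental a policymaker might track.

```python
# Illustrative sketch: rolling company-level fundamentals up to sector
# aggregates. All firms, sectors, and figures below are hypothetical.
import pandas as pd

micro = pd.DataFrame({
    "company": ["Bank A", "Bank B", "Insurer C", "Insurer D"],
    "sector": ["banking", "banking", "insurance", "insurance"],
    "total_assets": [500.0, 300.0, 200.0, 100.0],  # in billions
    "equity": [50.0, 24.0, 30.0, 10.0],
})

# Company-level leverage ratio (equity / total assets).
micro["leverage_ratio"] = micro["equity"] / micro["total_assets"]

# Aggregate to the sector level: sum the balance-sheet items first, then
# recompute the ratio from the aggregates (a simple average of company
# ratios would overweight small firms).
sector = micro.groupby("sector")[["total_assets", "equity"]].sum()
sector["leverage_ratio"] = sector["equity"] / sector["total_assets"]

print(sector)
```

Recomputing ratios from summed aggregates, rather than averaging company ratios, is the design choice that keeps the sector figure asset-weighted.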
Addressing the 90/10 problem
Statisticians and economic researchers at central banks are now grappling with the "90/10" problem traditionally faced by equity and securities research analysts in the private sector. This arises when skilled knowledge workers with specific quantitative and mathematical expertise end up spending 90% of their time gathering data and only 10% of their time analyzing it.
Research and statistical analysis experts at central banks are increasingly tasked with assembling micro, granular company-level inputs, while at the same time being asked to develop new, sophisticated forecasts and statistical models that identify potential systemic risk to the economy.
However, these PhDs, economists, accounting experts, and central bankers end up spending the majority of their time collecting data rather than developing these sophisticated models. As a result, quantitative-oriented researchers and expert statisticians end up acting more like data entry clerks than expert forecasters and modelers.
New micro-data management challenges
To ensure that analysts and statisticians spend less time gathering data and more time analyzing it, central banks are seeking solutions that bundle cleansed, aggregated micro-level company data with tools that facilitate the loading and co-mingling of proprietary macroeconomic data in the same format.
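What "the same format" means in practice is that a sector aggregate built from micro data and a proprietary macro series share one index and frequency, so they can be modeled side by side. The sketch below assumes a quarterly panel; the series names and values are illustrative only.

```python
# Illustrative sketch of commingling data in one harmonized format: a
# sector-level aggregate series joined with a proprietary macroeconomic
# series on a common quarterly index. All values are hypothetical.
import pandas as pd

dates = pd.period_range("2014Q1", "2014Q4", freq="Q")

# Sector aggregate derived from micro, company-level data (illustrative).
bank_leverage = pd.Series([0.091, 0.093, 0.090, 0.088], index=dates,
                          name="banking_leverage_ratio")

# Proprietary macroeconomic series (illustrative GDP growth, percent).
gdp_growth = pd.Series([0.4, 0.6, 0.5, 0.7], index=dates,
                       name="gdp_growth_pct")

# One harmonized table: same index, same frequency, ready for modeling.
panel = pd.concat([bank_leverage, gdp_growth], axis=1)

print(panel)
```

Once both series live in one aligned table, the analyst's time shifts from reconciling formats to actually fitting models against the panel.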
With data stored in a consistent and harmonized format, these skilled analysts and economists can then efficiently apply sophisticated models and statistics in order to produce the important reports that influence today's macroprudential-oriented policymakers.

Ken Rossiter is a senior product manager for SunGard's MarketMap analytic platform and analytic consulting services. He has 22 years of experience in the financial services industry and analytic data management systems in particular.