
An FAQ on the Global-Shared Utility for Reference Data

As momentum builds for standardizing on a single set of reference data, Allan Grody of Financial Intergroup Holdings offers a primer on the new infrastructure being driven through the Financial Stability Board.

Allan Grody, Financial Intergroup Holdings Ltd.
What is Reference Data?

Financial transactions can be thought of as a set of computer-encoded data elements. Part of each transaction is standard reference data, identifying it as a specific product bought by a specific business entity; part is variable transaction data, such as quantity and amount; and part is associated referential information, such as price data, credit ratings and other fundamental data. Analogous to the component items of a manufactured product, reference data defines a product's changing specifications (periodic or event-driven corporate actions such as mergers, acquisitions and spin-offs), occasional changes to sub-components (calendar data, credit ratings, historical prices, betas, correlations, volatilities) and seasonal incentives or promotions (dividends, capital distributions and interest payments).
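The split between the static, shared reference data and the variable per-trade data can be pictured with a small data model. This is a minimal sketch; the field names and values below are illustrative assumptions, not a published schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ReferenceData:
    # Fairly static, standard identifying data shared across many trades
    product_id: str        # e.g., a Unique Product Identifier (UPI) -- hypothetical value
    counterparty_id: str   # e.g., a Legal Entity Identifier (LEI) -- hypothetical value
    currency: str
    settlement_calendar: str

@dataclass
class Transaction:
    reference: ReferenceData  # the standardized, referential portion
    quantity: float           # variable, per-trade data
    price: float

ref = ReferenceData("UPI-123", "LEI-456", "USD", "NYSE")
t1 = Transaction(ref, quantity=100, price=52.10)
t2 = Transaction(ref, quantity=250, price=52.15)

# Both trades point at the identical reference record, so there is
# nothing between them to reconcile.
assert t1.reference == t2.reference
```

When every party holds its own copy of the reference record instead of a shared one, the equality check above becomes a costly reconciliation process, which is the problem the rest of this piece addresses.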

Why is Standard Reference Data Important?

Reference data should be consistent across each financial transaction's life cycle and throughout its supply chain. However, duplication of reference data is pervasive in large financial enterprises and throughout the industry, leading to significantly higher risk and operational costs. When reference data that should be identical is not, the result is miscalculated values, misidentified products, and the involvement of multiple supply chain partners (trade counterparties, custodians, paying agents, et al.) to resolve the problem. These individual transaction failures cause monetary loss, higher labor costs, and the potential for both transactional and systemic failure. In fact, much of our financial market utilities' activity amounts to little more than assuring that submitted data is accurate: that the product or contract, the counterparty and the associated transaction data are correct before payment is released.

Why a Global Utility for Reference Data?

Standardizing on a common data set of reference data would solve some long-standing problems for the financial industry: systemic risk caused by mismatched counterparty transaction failures; redundant costs for sourcing and maintaining the fairly static referential data that comprise 70% of a financial transaction; unnecessary costs for reconciling, mapping, transforming and securing this data; and failures from improperly and inconsistently aggregating data for reporting of performance and risk, both internally and for regulatory purposes. In the end, it would save most of the $2 billion annually spent by each of the largest financial institutions for duplicate sourcing, processing and maintenance of this data.

A Central Counterparty for Data Management (CCDM) would match multiple incoming sources of referential data, “clear” this data through best-of-breed computer analysis, and “settle” (distribute) industry accepted, CCDM assured datasets to primary participants and, in turn, to their downstream correspondents. This would entail an industry-wide effort not dissimilar to the clearing entities, netting systems and central depositories that emerged as industry-wide solutions to past industry-wide problems.
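One simple way to picture the "clear" step is a consensus vote across multiple vendor feeds for each reference-data attribute. The majority rule and vendor names below are illustrative assumptions, not the CCDM's actual matching logic:

```python
from collections import Counter

def clear_attribute(values):
    """Pick the consensus value for one reference-data attribute
    reported by several independent sources, and flag whether a
    clear majority agreed (a stand-in for 'best-of-breed' analysis)."""
    counts = Counter(values)
    value, votes = counts.most_common(1)[0]
    agreed = votes > len(list(values)) / 2
    return value, agreed

# Three hypothetical vendor feeds reporting the same bond's coupon;
# one feed carries a transposition error (4.52 vs 4.25).
feeds = {"vendorA": "4.25", "vendorB": "4.25", "vendorC": "4.52"}
coupon, ok = clear_attribute(list(feeds.values()))
print(coupon, ok)  # 4.25 True -- the 'settled' value distributed to participants
```

In a real clearing process the tie-breaking and weighting rules would be far more elaborate, but the shape is the same: many inputs in, one assured dataset out.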

Who Would Lead the Effort?

Leading this effort could well be the largest financial enterprises, now categorized as G-SIFIs (Global Systemically Important Financial Institutions), and a new global standards body, the G20's Financial Stability Board (FSB). G-SIFIs absorb the most cost and risk, bear the greatest burden of regulatory mandates, and are required to set aside operational capital under the new Basel III capital guidelines. They are to be rewarded with up to 20% capital relief for outsourcing risk, and faulty reference data is one operational risk that would qualify.

This approach to finally resolving the duplicate and faulty reference data problem is made simpler now that the FSB is directing infrastructure projects through member states' regulatory institutions. The first such project, the Global Legal Entity Identifier (LEI) System (GLEIS), will provide a unique identifier and 'business card' reference data for counterparties and other financial market participants in the swaps supply chain. Thereafter, all financial market participants that can issue, process or enter into a financial transaction for any financial instrument or contract are to register for an LEI. The Unique Product Identifier (UPI) and, eventually, the unique corporate event identifier are to follow, again under regulatory mandate.
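As context for the registration mandate, the LEI itself is a 20-character ISO 17442 code whose last two characters are check digits validated with the ISO 7064 MOD 97-10 scheme (the whole code, with letters mapped A=10 through Z=35, must equal 1 modulo 97). A minimal sketch; the 18-character prefix below is invented and does not belong to a registered entity:

```python
def _to_number(s):
    # Map each character to its base-36 value (A=10 ... Z=35, digits
    # unchanged), concatenate, and read the result as one integer.
    return int("".join(str(int(c, 36)) for c in s))

def lei_check_digits(prefix18):
    """Compute the two check digits for an 18-character LEI prefix
    per ISO 7064 MOD 97-10: 98 minus (prefix + '00') mod 97."""
    return f"{98 - _to_number(prefix18 + '00') % 97:02d}"

def lei_is_valid(lei):
    """A valid LEI is 20 characters and equals 1 modulo 97."""
    return len(lei) == 20 and _to_number(lei) % 97 == 1

# Hypothetical 18-character prefix (LOU code + entity-specific part):
prefix = "5299000J2N45DDNE4Y"  # invented, not a real registered entity
lei = prefix + lei_check_digits(prefix)
assert lei_is_valid(lei)
```

The check digits catch most transcription errors mechanically, but they cannot catch the deeper problem this article describes: two different, individually valid identifiers issued to the same entity.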

What is the Architecture of the CCDM?

The CCDM is intended to be an extension of the federated network architecture of the evolving global identification system being formulated by the FSB's newly nominated Global Legal Entity Identification Foundation (GLEIF) Board. The Board is to be empowered to design the Central Operating Unit (COU) that will implement the 'Internet-like' federated model for organizing separate databases maintained by local operating units (LOUs). The LOUs are to organize their local registries following a 'plug-in architecture' and deploying a 'network card'. This would permit legal entity identification and associated reference data to be aggregated into a virtual database accessible from any point on the globe.
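The federated model can be pictured as independent local registries behind a thin routing layer that presents them as one virtual database. The registry contents, identifiers and lookup logic below are illustrative assumptions, not the GLEIF's actual design:

```python
# Each local operating unit (LOU) maintains only its own registry;
# no single physical database holds all records.
local_registries = {
    "LOU-A": {"LEI-A-001": {"name": "Alpha Bank AG", "country": "DE"}},
    "LOU-B": {"LEI-B-042": {"name": "Beta Capital LLC", "country": "US"}},
}

def lookup(lei):
    """Resolve an LEI by querying each LOU's registry in turn --
    the federation appears to the caller as one virtual database."""
    for registry in local_registries.values():
        if lei in registry:
            return registry[lei]
    return None

print(lookup("LEI-B-042"))  # {'name': 'Beta Capital LLC', 'country': 'US'}
```

A production COU would route queries by LEI prefix rather than scanning every LOU, but the essential property is the same: the caller needs no knowledge of where a record physically lives.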

[For more on Fixing the Reference Data Problem, see Allan Grody's related story.]

This is also the platform upon which the CCDM will be built, as an extension of the GLEIS: first adding the reference data needed to operationalize the LEI for use in business applications within financial institutions (and by regulators); followed by the UPI; then the FEI (Financial Event Identifier) for corporate control changes or reorganization events affecting the LEI and UPI; and, finally, the UTI (Unique Transaction Identifier), providing a computer audit trail for all reference data and the financial transactions using it.

What is the current state of the GLEIS?

To date there is an 'interim GLEIS' being rolled out under a data file sharing arrangement. Preliminary LEIs (pre-LEIs) are being assigned by preliminary local operating units (pre-LOUs) for use in swaps data reporting. Each pre-LOU is to transmit a complete file and a daily change file to each of the approximately 10 pre-LOUs that have been authorized to issue pre-LEIs; another 15 pre-LOUs have been identified but not yet authorized. The common set of data attributes has still to be defined for the interim GLEIS, even though pre-LEIs and UTIs are already being used in mandated swaps reporting.
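The full-file-plus-daily-change arrangement is an ordinary snapshot-and-delta merge. The record shapes and change actions below are assumptions for illustration, not the interim GLEIS file specification:

```python
# Yesterday's complete file from one pre-LOU (hypothetical records).
full_file = {
    "PRELEI-1": {"name": "Gamma Funds SA"},
    "PRELEI-2": {"name": "Delta Holdings Plc"},
}

# Today's daily change file: each entry is (action, identifier, record).
daily_changes = [
    ("upsert", "PRELEI-2", {"name": "Delta Holdings Ltd"}),  # entity renamed
    ("upsert", "PRELEI-3", {"name": "Epsilon GmbH"}),        # new registration
    ("delete", "PRELEI-1", None),                            # record retired
]

def apply_changes(full, changes):
    """Apply a daily delta file to a complete snapshot file."""
    merged = dict(full)
    for action, lei, record in changes:
        if action == "upsert":
            merged[lei] = record
        elif action == "delete":
            merged.pop(lei, None)
    return merged

merged = apply_changes(full_file, daily_changes)
print(sorted(merged))  # ['PRELEI-2', 'PRELEI-3']
```

With roughly 10 authorized pre-LOUs each shipping files to every other, this merge must run on the order of a hundred sender-receiver pairs daily, which is why the lack of a common attribute set is so costly.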

A number of issues have surfaced: poor data quality, proprietary identifiers used in place of pre-LEIs, non-standard UTIs, multiple data formats used in reporting swaps transactions, and no capability for regulators to ingest or aggregate this data. In this state it is unknown how many duplicate pre-LEIs have been issued, if any, although the probability of such duplicates is high. Both the CFTC and the FSB have recently issued consultative papers requesting industry comment on these issues (see the CFTC's Request for Public Comment on its Swap Data Reporting Rules and the FSB's consultative paper on Aggregation of OTC Derivatives Data). Both documents suggest an opportunity for a coordinated global industry response.
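One way duplicate pre-LEIs could be screened for, sketched under stated assumptions: normalize registered names across pre-LOU files and group entities that received different identifiers. The sample records and the crude normalization rule are invented for illustration; real entity matching would also weigh addresses, registration numbers and fuzzy-match scores:

```python
import re

# Hypothetical records drawn from two different pre-LOUs' files.
records = [
    ("PRELEI-100", "Zeta Bank, N.A."),
    ("PRELEI-200", "ZETA BANK NA"),      # same entity, different pre-LOU
    ("PRELEI-300", "Eta Insurance Ltd"),
]

def normalize(name):
    """Lowercase and strip punctuation/whitespace so trivially
    different spellings of one name compare equal."""
    return re.sub(r"[^a-z0-9]", "", name.lower())

groups = {}
for lei, name in records:
    groups.setdefault(normalize(name), []).append(lei)

# Any normalized name carrying more than one identifier is a
# candidate duplicate for human review.
duplicates = {k: v for k, v in groups.items() if len(v) > 1}
print(duplicates)  # {'zetabankna': ['PRELEI-100', 'PRELEI-200']}
```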

-Allan D. Grody is President of Financial Intergroup Holdings Ltd. He is the inventor and organizer of the CCDM.
