The onslaught of new regulations and pressing calls for greater transparency from investors following the financial crisis have changed the way firms are addressing data management.
As the U.S. struggles to emerge from its worst economic crisis in 80 years, capital markets organizations have been forced to re-examine the way they report financial information. Regulators are continuing to ramp up scrutiny of exchanges, brokerages, hedge funds and every financial firm in between. What has become crystal clear, in light of post-financial-crisis economic conditions as well as a slew of recent high-profile technology glitches, is that the tolerance for risk is at an all-time low.
Today, financial institutions realize that electronic trading risk doesn't exist in a vacuum, suggests Neal Goldstein, managing director of electronic execution at JPMorgan and co-chair of the FPL Americas Risk Management Working Group, which recently published a set of risk guidelines for the global financial industry.
Treasurers, CFOs and chief investment officers need to understand risk in real time. As such, firms are increasingly focused on cross-asset aggregation of risk exposure. "At the large bulge bracket banks, clients often have trading relationships across multiple desks and lines of business," Goldstein says. "The holy grail is to be able to quantify a client's aggregate risk exposure across asset classes. This is particularly important if these risk checks may result in a kill switch being applied when a client's trading position exceeds a given exposure threshold." For instance, without a cross-asset class view of risk, a broker wouldn't know that a client was overexposed to a certain type of debt.
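The aggregation and kill-switch logic Goldstein describes can be sketched in a few lines; the client name, asset classes, exposure figures, and the firm-wide limit below are all illustrative, not taken from any actual bank's risk policy.

```python
from collections import defaultdict

# Hypothetical per-desk exposure feeds; in practice each record would come
# from a different desk's trading system (all names and figures illustrative).
positions = [
    {"client": "ACME", "asset_class": "equities", "exposure": 40_000_000},
    {"client": "ACME", "asset_class": "credit",   "exposure": 35_000_000},
    {"client": "ACME", "asset_class": "rates",    "exposure": 30_000_000},
]

KILL_SWITCH_THRESHOLD = 100_000_000  # illustrative firm-wide per-client limit

def aggregate_exposure(positions):
    """Sum each client's exposure across all asset classes."""
    totals = defaultdict(float)
    for p in positions:
        totals[p["client"]] += p["exposure"]
    return dict(totals)

def breaches(totals, limit=KILL_SWITCH_THRESHOLD):
    """Clients whose aggregate exposure would trip the kill switch."""
    return [client for client, exp in totals.items() if exp > limit]

totals = aggregate_exposure(positions)
print(breaches(totals))  # ['ACME']
```

Note that no single desk's position here exceeds the limit on its own; only the cross-asset total does, which is exactly why a per-desk view would miss the overexposure.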
Larry Tabb, CEO and founder of advisory firm TABB Group, agrees: "It's about aggregating terms and positions, analyzing what they're worth, trying to create a value on them, doing analytics on them. And trying to see what's the sensitivity if interest rates or market conditions change and understand how this all fits together with your financial conditions. All these rules are intrinsically intertwined with data management and data clarity."
To really be able to assess risk clearly across all asset classes and geographies, banks need their disparate systems to publish open-order and executed-trade information to a central system, Goldstein states.
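The publish-to-a-central-system arrangement can be sketched with an in-process queue standing in for the messaging layer; a real deployment would use a message broker (Kafka, an MQ product, or similar), and the system names and notionals here are invented for illustration.

```python
import json
import queue

# In-process stand-in for the firm's messaging layer (assumption: a real
# setup would use a durable message broker rather than a local queue).
risk_bus = queue.Queue()

def publish(source_system, event_type, payload):
    """Each desk system publishes open orders and executions to the bus."""
    message = {"source": source_system, "type": event_type, **payload}
    risk_bus.put(json.dumps(message))

def central_risk_consumer():
    """The central system drains the feed and tallies exposure per client."""
    exposure = {}
    while not risk_bus.empty():
        msg = json.loads(risk_bus.get())
        exposure[msg["client"]] = exposure.get(msg["client"], 0) + msg["notional"]
    return exposure

# Disparate systems push to the same central feed (names illustrative).
publish("equities-oms", "open_order", {"client": "ACME", "notional": 5_000_000})
publish("fx-oms", "execution", {"client": "ACME", "notional": 2_000_000})

print(central_risk_consumer())  # {'ACME': 7000000}
```

The point of the pattern is that the desks stay independent: each keeps its own system, and only a normalized event stream flows to the central risk view.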
Meanwhile, given tighter capital rules, investment banks have been forced to slash costs and shed thousands of jobs across all departments. Technologists haven't been spared. As such, firms are looking for more efficient ways to address their IT issues, including data management processes. "They are looking into new ways of approaching macro data management," Tabb says.
Some financial institutions have started focusing on higher-grade analytics to try to find patterns in data, Tabb notes. One of their current priorities is trying to gain analytical insight into unstructured data, with open source platforms such as Hadoop rapidly gaining popularity across the capital markets.
Firms also have been investing in areas such as reference data. One driver is the real-time reporting requirement for swaps under the 2010 Dodd-Frank Act, which aims to prevent banks from taking risky bets for their own gain rather than on behalf of their customers; another is the Office of Financial Research's role in establishing a global legal entity identifier for systemic risk management.
Since budgets aren't cooperating, firms are shuttering other initiatives and focusing on infrastructure, particularly data infrastructure, says Tabb. "To a certain extent they're trying to rob Peter to pay Paul."
In the current environment, smaller is sometimes better. For instance, firms are no longer spending money on gigantic data warehouses built around a single data model meant to fit all the data in one place, says Amir Halfon, CTO of financial services at MarkLogic, a provider of database technology. "It's that aspect that introduces the most cost. The traditional approach of a big data warehouse doesn't cut it anymore."
As they strive for greater efficiency, firms are looking at more messaging-oriented data management, in which reference data lives in the systems of record, which then pass that information to downstream systems on demand instead of trying to create a complete set of terms and conditions for everything within the organization, Tabb explains.
"Those things get massive and hard to cost-justify," Tabb says. "You can invest in making your fixed income system as robust as possible. And then if the equities system needs some fixed income data, you can pass that data across and let the fixed income system be where the data of record lies." As a result, more work is being done on the messaging infrastructure "so that firms no longer have to build those massive databases that are almost obsolete by the time they're created."
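Tabb's system-of-record example can be sketched as a simple request/reply exchange: the fixed income system owns its instrument data, and the equities system asks for a single record over messaging rather than maintaining a duplicate warehouse copy. The identifier, record fields, and service interface below are all hypothetical.

```python
# The fixed income desk's system of record owns this data outright
# (illustrative record; "US912828XX" is a made-up identifier, not a real ISIN).
FIXED_INCOME_RECORDS = {
    "US912828XX": {"coupon": 2.5, "maturity": "2030-05-15", "rating": "AA+"},
}

def fixed_income_service(request):
    """Request/reply handler exposed by the fixed income system of record.

    In production this would sit behind the messaging infrastructure;
    here it is a plain function call for the sake of the sketch.
    """
    isin = request["isin"]
    record = FIXED_INCOME_RECORDS.get(isin)
    return {"isin": isin, "found": record is not None, "data": record}

# The equities system needs one bond's terms, so it sends a request
# instead of querying its own (nonexistent) copy of fixed income data.
reply = fixed_income_service({"isin": "US912828XX"})
print(reply["data"]["coupon"])  # 2.5
```

The design trade-off is the one Tabb names: the firm accepts a messaging dependency between systems in exchange for never building, and never having to keep current, a monolithic database that duplicates every desk's terms and conditions.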