As with any large technological disruption, hype has a way of getting far ahead of industry adoption. This is especially true with big data, the biggest buzzword to come along since we scratched our heads over the concept of cloud computing.
To date, virtually no financial services business has managed to mobilize big data platforms as an expansive, end-to-end enterprise application. The promise of big data failed to account for the groundwork and cultural adoption required, or for the shortage of experts able to churn the data and ask the right questions. As a result, Gartner has pushed back its forecast for mainstream big data maturity in financial services from three to five years to at least five, and more realistically as many as 10.
Trial And Error
Given the often broad definition of big data and its use, nearly every firm claims to have some sort of pilot underway. Bigger companies are often taking traditionally structured database products and labeling them as big data products, and vendors come to market with “big data” packages that may use analytics but not necessarily big data to make decisions. “I’m hopeful that in time the definition will start to settle down, that people will stop being so breathless and confused about it,” says James Cantarella, business development manager at Thomson Reuters. “It’s simpler and more powerful when we understand what it is.”
“Social media-based modeling is brand new, and there’s clearly a huge interest in its conjunction with mean reversion and momentum strategy.” — Louis Lovas, director of solutions at OneMarketData
Gartner estimated that total spending on software, social media, and IT services related to big data and analytics neared $28 billion in 2012, a figure likely dwarfed by spending in 2013. For all that interest and investment, some notable progress has been made. Point applications are starting to emerge that sustain the hype and reinforce the recognition that mastery of data analytics has become critical to competitive success. Just how far big data has taken off in financial services today is still difficult to say, as presumably many success stories go unreported by firms hoping to protect their special approaches.
“I don’t believe success is systemic,” says Philip Brittan, CTO and global head of platform for financial and risk at Thomson Reuters. “When people can put data in a Hadoop cluster and run interesting analysis on piles of data, they see all kinds of value. These are the basic building blocks. It’s nothing you can take out of a box and it works or does not work. It’s a foundation that firms can build on; some will use it to do valuable things, while others will encounter failures.”
Taking The Plunge
There’s no shortage of applications that can be tested, so the more important issue becomes how you get to bottom-line value, says Sean O’Dowd, capital markets program director at Teradata. “Firms are building data labs and discovery operations to put money where their mouth is, to execute against the promise that has been hype and headline for the last couple of years.” He adds that many of the new capabilities put out by vendors remain untested at this point in the adoption cycle. “Until you harden the products with real-world operations and use them in the field, you don’t get the balance between a best-fit solution and market need, and we’re going through that right now. There’s nobody who has the full answer; there’s just not enough utilization in the market now, but it is coming.”
“I’m going to figure out how to capture relevant data and use it in real time for better performance and transfer at lightning speed to keep me ahead of everyone else.” — David Meitz, CTO at the research broker Investment Technology Group
Perhaps big data’s progress is not unlike how the cloud moved forward. Salesforce.com, for example, has become exponentially smarter in meeting the needs and demands of customers than it was even five years ago.
To be fair, the big data market is still in the early execution phase of the technology cycle, and software providers will dangle many new ideas and capabilities. Some are adopted, some die on the vine, and some are far ahead of their time.
For banks and financial institutions testing the value of big data, the focus hasn’t been so much on newer data sources as on mining internal data sets, particularly in the area of regulatory compliance.
“Where rubber meets the road, there’s been a great deal of activity trying to understand how to reduce fines or get ahead of infractions before they happen,” explains O’Dowd.
In light of heightened regulation, it’s not unreasonable for a diversified financial firm to appropriate upward of a quarter-billion dollars per year to manage fines and regulatory penalties. And it wouldn’t be atypical for the outfit to appropriate millions more toward settling communications fines. As a result, big data solutions are often applied to break down internal text and voice communications, converting unstructured data into usable formats that compliance teams can run analytics against to detect patterns that could result in violations and fines. Over time, these functions are able to raise red flags early enough to avert infractions. In one example, a company’s analysis of internal communications resulted in $4 million in reduced direct labor costs (manual reconciliation) and $9 million in avoided fees to regulators.
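To make the idea concrete, here is a minimal sketch of the kind of pattern screen a compliance function might run over message text once voice and chat have been converted to a usable format. The rule names, regular expressions, and message format are illustrative assumptions, not any firm’s actual surveillance logic; production systems use far richer lexicons and NLP models.

```python
import re

# Illustrative red-flag patterns; real compliance lexicons are far larger.
RULES = {
    "guarantee_language": re.compile(r"\bguarantee(?:d)?\s+returns?\b", re.IGNORECASE),
    "off_channel": re.compile(r"\b(?:call|text)\s+my\s+(?:cell|personal)\b", re.IGNORECASE),
    "secrecy": re.compile(r"\bkeep\s+this\s+between\s+us\b", re.IGNORECASE),
}

def flag_messages(messages):
    """Return (message_id, rule_name) pairs for every rule a message trips."""
    hits = []
    for msg_id, text in messages:
        for rule_name, pattern in RULES.items():
            if pattern.search(text):
                hits.append((msg_id, rule_name))
    return hits

msgs = [
    (1, "We can guarantee returns of 12% this quarter."),
    (2, "Lunch at noon?"),
    (3, "Keep this between us -- call my cell later."),
]
print(flag_messages(msgs))
```

A screen like this is only the front end: flagged messages would feed a case-management queue where analysts confirm or dismiss each hit, and the hit rates feed back into tuning the rules.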
A great number of big data use cases in the capital markets leverage incoming data streams that help traders make more-informed decisions. David Meitz, CTO at execution provider and research broker Investment Technology Group (ITG), understands that regardless of whether the information given to a trader comes from a historical query or a real-time data feed, the ability to deliver appropriate and trusted information within microseconds can mean the difference between profitable and unprofitable trades.
“As a trader, that’s the big data value,” says Meitz. “I’m going to figure out how to capture relevant data and use it in real time for better performance and transfer at lightning speed to keep me ahead of everyone else.” To that end, ITG’s leap into big data includes a dynamic cache that updates structured and unstructured data in real time. The functionality gives traders updates on their performance throughout the day so they can make more informed decisions, including how that last execution went and what the cost, time, and fees were. Five years ago, that would have been part of the post-trade analysis.
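The intraday cache Meitz describes can be sketched as an in-memory store that folds each new execution into per-trader statistics as it arrives, so a trader can query current fills, notional, and fees mid-session instead of waiting for post-trade analysis. The record fields and statistics below are assumptions for illustration, not ITG’s actual schema.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical execution record; these fields are illustrative assumptions.
@dataclass
class Execution:
    trader: str
    symbol: str
    qty: int
    price: float
    fees: float

class PerformanceCache:
    """In-memory per-trader stats, updated incrementally as each fill arrives."""
    def __init__(self):
        self._stats = defaultdict(lambda: {"fills": 0, "notional": 0.0, "fees": 0.0})

    def on_execution(self, ex: Execution):
        # Fold the new fill into the running totals for this trader.
        s = self._stats[ex.trader]
        s["fills"] += 1
        s["notional"] += ex.qty * ex.price
        s["fees"] += ex.fees

    def snapshot(self, trader):
        # Return a copy so callers can't mutate the live cache.
        return dict(self._stats[trader])

cache = PerformanceCache()
cache.on_execution(Execution("t1", "IBM", 100, 185.50, 1.25))
cache.on_execution(Execution("t1", "MSFT", 50, 410.00, 0.80))
print(cache.snapshot("t1"))
```

The design choice is incremental aggregation: each fill updates running totals in O(1), so a snapshot is always current without rescanning the day’s executions, which is what makes intraday (rather than end-of-day) performance review feasible.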
“Where rubber meets the road, there’s been a great deal of activity trying to understand how to reduce fines or get ahead of infractions before they happen.” — Sean O’Dowd, capital markets program director at Teradata
“All of this I can monetize, and it has real value to traders,” Meitz says. “We used caching to electronically scrub data and to decide what to keep; but the reality is, you don’t know what changes are going to come into play. Maybe there are elements that can be used to improve performance or data analytics. Whatever it is, big data now has a place. For ITG, big data has gone well beyond the problem stage and turned into a huge opportunity for us.”

Becca Lipman is Senior Editor for Wall Street & Technology.