Data Management

Frank Piasecki, ACTIV Financial
Commentary

The Drag of Legacy Enterprise Systems: Market Data Efficiencies Take Center Stage

Change is one of the only constants when it comes to market data, and firms that take a reactive stance to changing market data dynamics face higher risks and costs down the road. Market-data-as-a-service is the model of the future. Is your firm planning for the future or focused on optimizing legacy models?

Today’s financial markets are defined by continuous change in nearly every aspect of the business cycle, which means enterprise market data models that remain stagnant face a perpetual stream of disruptions. Big issues facing today’s firms include new regulatory involvement in the core technology of markets and trading systems, OTC derivatives reform in the U.S. and European markets, and the impact of the Fed’s quantitative easing and its implications for volatility and volume across all markets. Among the most significant and consistent concerns for the industry, from an operational and business-performance point of view, is how to manage increasing volumes and sources of data across asset classes, in real time, without seeing network capacity and latency investment costs soar.

New content requirements, message rate growth, regulatory mandates, broader geographic scope, and growth in functional application requirements for data consumption have reinvented the traditional role of the enterprise market data professional in finance. These pressures have also pushed the industry to embrace a new delivery model, one we call market-data-as-a-service: a fully integrated, managed service that lets customers reap the benefits of market data consumption without the cost of ownership, while preserving competitive pressure among service providers.

What makes enterprise market data management so challenging in the current environment is the deluge of new data types and sources that must be organized and managed in more physically dispersed locations than ever before. Data needs to move within the compute device, within data centers, across trading floors, remote offices, exchange co-location facilities, mobile devices, extranets, and public and private clouds, and be consumed by a global user base. Can this highly dispersed IT problem be harnessed without constant and direct oversight?

Continuous change in business strategy and structure means data is being generated and consumed from within and outside the organization. Each type of data will come with its own unique demands from both internal and external customers who want greater flexibility and control across enterprise market data systems. Internal content, customer communication and regulatory reporting are now real-time events, but we haven’t seen a system emerge as a viable solution to capture and manage them.

The meaning of real-time data has shifted over the past ten years, and the complexity of new systems means enterprises need high-performance, flexible technology to manage data. IT groups need to be restructured to support all asset classes as well as new risk and compliance demands under global operating requirements.

Legacy market data systems are beginning to slow the enterprise’s ability to respond to changing market dynamics. Sticking with a legacy system may seem less disruptive, given the upfront costs of switching networks, hardware, and application interfaces, but a new model is required to prevent unnecessary service disruptions from peak volumes and bottlenecks. Otherwise, firms with large, unmanageable systems that have been patched up and added on to over the years will face sub-par performance, leading to slower response times and diminished business intelligence, which can have a significant impact on the bottom line over time.

In an effort to improve the performance of legacy systems, some firms briefly explored deconstructing their market data services. Unbundling and rebundling data components was attempted in the hope of abandoning outdated market data services, but this approach means someone has to run an operation 24/7/365 as an internal market data supplier, adding cost without significant business advantage. Firms are beginning to realize that building their own systems, or continuing on legacy ones, is not a sustainable plan. Switching to a modern technology solution architected for peak volume and performance, one that wraps up infrastructure and capacity management, monitors availability, and removes capital expenditure, is not only ideal but necessary. Yet some firms hesitate to tackle projects of this size because of upfront switching costs and the potential need to modify existing processes and protocols. That thinking fails to properly account for the long-term costs of inaction, which can be far more significant to the business over time.


In the last few years we’ve seen market data services progress from dumb video screens, to streaming web service access, to feeds that integrate directly into trading and risk applications. With these changes have come major IT management responsibilities with costs that are difficult to justify. Market-data-as-a-service streamlines and simplifies the collection and distribution of market data on a single platform, across all use cases: ultra-low latency (ULL), mobile, web, and enterprise.

We’ve also seen the industry seek independence from its market data suppliers. The trend today is for market data consumers to assume management of content without having to worry about maintaining the underlying infrastructure that delivers the content to end-user applications. So how do you achieve independence when the terminal delivery model is an insufficient transport? Instead of purchasing vast quantities of data from vendors through terminals, firms need a service that manages both content and the underlying technology on a unified platform.

The market-data-as-a-service model can reduce overall costs by 50 percent and internal infrastructure by as much as 90 percent. These savings come without diminishing data performance; in most cases, performance dramatically improves. So at the end of the day, market data professionals must take a hard look at their total cost of ownership and decide what is best for the future of their market data management: the short-term, upfront costs of switching to a new, more efficient model, or the long-term drag on performance that comes with an outdated system.
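The total-cost-of-ownership trade-off above can be sketched as a simple break-even calculation. The sketch below uses hypothetical dollar figures (the annual run cost, the one-time migration cost) together with the 50 percent run-cost reduction cited in the text; every number is an illustrative assumption, not vendor data.

```python
# Hypothetical TCO comparison: staying on a legacy market data system
# vs. switching to a market-data-as-a-service model.
# All dollar figures are illustrative assumptions.

def cumulative_cost(upfront, annual, years):
    """Total spend after `years`: one-time cost plus recurring annual cost."""
    return upfront + annual * years

legacy_annual = 1_000_000              # assumed annual run cost of the legacy stack
service_annual = legacy_annual // 2    # article cites roughly 50% cost reduction
switching_cost = 750_000               # assumed one-time migration cost

for year in range(1, 6):
    legacy = cumulative_cost(0, legacy_annual, year)
    service = cumulative_cost(switching_cost, service_annual, year)
    marker = "  <- service now cheaper" if service < legacy else ""
    print(f"year {year}: legacy ${legacy:,} vs. service ${service:,}{marker}")
```

With these assumed figures, the recurring saving of $500,000 per year recovers the $750,000 switching cost partway through year two, after which the gap widens every year — the "long-term drag" the article describes.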

Frank Piasecki is president and co-founder of ACTIV Financial.

Comments
IvySchmerken
User Rank: Author
10/7/2013 | 1:11:06 PM
re: The Drag of Legacy Enterprise Systems: Market Data Efficiencies Take Center Stage
Yes, I agree, firms will resist reducing their own market data departments. But large FIs are going to be challenged by new data sets, i.e., OTC derivatives, as those instruments start to trade on SEFs and new feeds become available. The challenges will only grow, not diminish. Experimenting with market data as a managed service could be tested in a small area or on a new trading desk. It doesn't mean the entire firm has to convert over.
Greg MacSweeney
User Rank: Apprentice
10/7/2013 | 10:22:45 AM
re: The Drag of Legacy Enterprise Systems: Market Data Efficiencies Take Center Stage
It seems to me that it would be difficult to outsource market data operations at a large FI. First, legacy market data systems are linked to many other legacy systems; unraveling them would be a major challenge. There is also "institutional inertia": huge market data business units are set up across financial firms, and outsourcing market data is a big threat to their existence. So the inertia is to keep doing what the firm has always done.
IvySchmerken
User Rank: Author
10/5/2013 | 1:39:47 AM
re: The Drag of Legacy Enterprise Systems: Market Data Efficiencies Take Center Stage
Frank, great piece! While it seems like market-data-as-a-service is less costly and simpler, many large firms continue to run complex market data operations internally. Is this because market data is the lifeblood of Wall Street and they are afraid to outsource it? Or is it more that legacy market data systems are integrated with applications, and it would take a lot of work to switch to a new service?