Branden Jones | Commentary

Persistent Automation for Fund Management

In this age of data management, operational models must be able to house, curate, and level off information sets as they happen.

This is the year for big data. Across industries, firms hold unprecedented amounts of both public and private information -- from user profiles and consumer habits to business outputs and proprietary algorithms. But access to data, or information at large, does not guarantee a valuable yield. Jonathan Shaw, managing editor of Harvard Magazine, notes, “The [data] revolution lies in improved statistical and computational methods, not in the exponential growth of storage or even computational capacity.” Data is ubiquitous but not intrinsically valuable. It needs to be smartly processed, not just farmed.

For hedge funds, data processing is the quiet, invisible work that runs through the trade lifecycle -- data is accessed from external entities like exchanges and brokers, modified and adjusted in execution, and at times frozen in snapshots for an increasingly complex group of investors and regulators. More operational credibility and regulatory compliance are required than ever before, with increased scrutiny of the secret buy-side manna that goes along with them.

Smarter data management can be expensive and time-consuming as funds work to keep up with regulatory, compliance, and transparency requirements. But good fund management starts and ends with precise, accurate data management: understanding a whole new data ecosystem, and processing it in new ways through selective automation and augmented observation.

Lifecycle convergence
While data management has historically been the purview of three separate functions (front-, middle-, and back-office), funds are now considering data inflows and outflows as simultaneous and holistic activities that not only govern market data and transparency capabilities, but also the capacity to be position-aware. According to an Aite report from earlier this year, “…regardless of whether firms currently outsource or plan to outsource, the most common impressions of the benefits of using a single front- to back-office vendor for fund operations revolve around the attractiveness of holistic functionality, the expected contribution of a specialized vendor’s experience gained from other firms, and the vendor’s potential to better service clients.”

Essentially, funds are approaching operations as an ecosystem instead of a single-track pipeline in which data moves in only one direction. The ecosystem houses converging cross-office data functions as near-simultaneous activities, beyond the linear progression of the traditional lifecycle. Risk is moving to the front office. Portfolio management is constant. And compliance is everywhere. The pre-data model of the ’80s and ’90s no longer works: non-computational and hindered by actual human movement, it forced data into a single line, waiting its turn to be moved in and out of an outdated fund architecture by personnel who may or may not exist in today’s hedge fund reality.

The data map has changed: It’s time for a new hedge fund model.

The new data model
Funds must not only actively manage a growing universe of market data but also tackle performance reporting, risk projections, disaster planning, and partitioned client data.

To successfully, and simultaneously, manage these activities, funds must have a data operational model that supports automation, including:

Processing. Real-time, continuous action is the new normal in today’s hedge fund reality. As pressure from both investors and regulators increases, managers should rely on continuous, automated services, processes, and technology to support their businesses -- not as an occasional snapshot, but constantly, throughout the lifespan of the fund.

Normalization. Normalization guarantees safe passage of incoming data, regardless of origin, as it converges with its intended destination(s) within the fund infrastructure. Consistent data, through consistent ongoing normalization, translates into accurate pricing and valuations for real-time and forward-looking portfolio management, as well as precise analysis and reporting for investors.
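As a minimal sketch of the idea -- the source names and field layouts below are illustrative assumptions, not any real feed’s schema -- normalization amounts to mapping heterogeneous records onto one common shape before they reach pricing and reporting:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Price:
    """The fund's common price schema (hypothetical)."""
    symbol: str
    price: float  # USD
    source: str

def normalize(record: dict, source: str) -> Price:
    """Map a raw feed record onto the common schema.

    Each source names and encodes its fields differently;
    the per-source conventions here are illustrative assumptions.
    """
    if source == "broker_a":
        # broker_a sends tickers in mixed case and prices as strings
        return Price(symbol=record["ticker"].upper(),
                     price=float(record["last"]),
                     source=source)
    if source == "exchange_b":
        # exchange_b quotes in integer cents
        return Price(symbol=record["sym"].upper(),
                     price=int(record["px_cents"]) / 100.0,
                     source=source)
    raise ValueError(f"unknown source: {source}")
```

Downstream valuation code then consumes only `Price` objects, so a new feed requires one new mapping rather than changes throughout the fund infrastructure.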

Historical. The ability to investigate and use historical, security-level data unique to the fund is key to the success of the business. Arming a fund with since-inception data allows the manager to transform the most unique and granular drivers of past performance into the underpinnings of practicable, forward-looking initiatives across alpha generation, risk management, investor insights, and compliance.
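To make the since-inception point concrete, here is a hedged sketch -- not any vendor’s methodology -- of two basic statistics a manager might compute once the fund’s full daily return history is in hand:

```python
def since_inception_return(daily_returns):
    """Compound a series of daily returns (e.g. 0.01 for +1%)
    into a cumulative since-inception return."""
    total = 1.0
    for r in daily_returns:
        total *= (1.0 + r)
    return total - 1.0

def max_drawdown(daily_returns):
    """Largest peak-to-trough decline of the compounded equity
    curve, expressed as a negative fraction."""
    equity, peak, worst = 1.0, 1.0, 0.0
    for r in daily_returns:
        equity *= (1.0 + r)
        peak = max(peak, equity)
        worst = min(worst, equity / peak - 1.0)
    return worst
```

For example, a +10% day followed by a -5% day compounds to roughly +4.5% since inception, with a maximum drawdown of -5%. The same pattern extends to security-level histories, where the inputs are per-position rather than fund-level returns.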

Defense. While data trafficking, shaping, and viewing are relatively benign activities, true data management demands a fourth component: the ability to uncover and recover from adverse events, and with it the greater protection of investor interests. Cloud technology provides the best option for funds to house data infrastructure, offering not only secure and convenient access but also automated virtual warehouses and backup systems that shield the business from physical and environmental risks like earthquakes, floods, or outages. It matters not only how data is managed, but where.

As Global Head of Marketing for Liquid Holdings Group, Branden drives brand awareness and market adoption of Liquid's new approach, providing hedge funds with the best way to de-risk their business, enhance decision making, improve transparency and ultimately put more money ...
Wall Street & Technology - July 2014