Data Management in Capital Markets Welcomes the New Year

Innovation is on the rise for consuming, processing, analyzing and storing data and ultimately mining actionable insights - diamonds from mountains of coal

Louis Lovas, Director of Solutions, OneMarketData

It’s the beginning of a new year, a time when most of us pause in reflection, if only for a moment, to consider what might have been and the future yet to come. Paths chosen and actions taken all plot a course ahead, whether intended or fortuitous. This applies to our lives, to our careers, and to the technological innovation fueling the engine of finance.

We live in the dissonance of a world shaped and driven by technology of ever-increasing complexity. And finance is one of the few industries where technology takes center stage in meeting bottom-line profitability goals. This unique partnership between the business and technological innovation is a result of the increasing sophistication of trading.

Returns are a dish best savored hot. A firm’s ability to manage money for effective returns requires continual investment to stay one step ahead of the competition. As Art Cashin recently said, “Because you’re not in the business of collecting money. You’re in the business of managing money.” The recipe is a mix of intellectual prowess, the latest in technological innovation, opportunity and a bit of luck.

At the center of the technology-driven world is data. Data and its broad accessibility have produced dramatic changes, from reading glasses to Google glasses, from dBase to Hadoop. The information age is transforming into the Data Age. Financial institutions are awash in data; it comes at them from every direction, ebbing and flowing like the tide, and understanding it is a game changer. Innovation is on the rise for consuming, processing, analyzing and storing data and ultimately mining actionable insights - diamonds from mountains of coal. Below are a handful of reflections and expectations for the coming year on this broad category.

Coming to Terms with Big Data: One Database (Architecture) Does Not Rule Them All

Much has been written about Hadoop in the past year. The sharp intersection of numerous factors has the industry abuzz about this data management technology, to name a few: the meteoric rise in big data and social media’s influence on capital markets; consuming, storing and analyzing fire-hose volumes of pricing data across fractured markets and asset classes; a recognition that understanding data is a game changer; and, not least, a strong desire to better manage technology costs. In the data-centric world of capital markets, these intertwining factors feed a trading firm’s desire to hunt for alpha anywhere it can be found.

Yet maximizing data’s effectiveness means achieving time-sensitive business goals, and a one-size-fits-all approach to data management is a fool’s errand. As a basic utility, all data management should provide durability and consistency, two of the well-known ACID properties. Beyond that, content is king. In many cases content is time-sensitive, and query latency correlates directly with the ability to make actionable decisions. That timeliness falls along a continuum based on industry and business goals. Trading demands microsecond decision times, and data access has to achieve that performance profile.

All data has structure, falling along a spectrum from tightly defined schemas to amorphous content encompassing textual dialects and slang. Likewise, data has mutability characteristics. Managing retail inventory demands highly transactional systems, possibly in-memory variants for better performance. Conversely, equity trades, like most any asset class, represent immutable real-world events occurring tens of thousands to millions of times a second. Consuming, storing and analyzing that fire hose requires lock-free systems for the highest throughput and minimum latency, as the sketch below illustrates.
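To make that concrete, here is a minimal sketch of a lock-free structure, assuming exactly one producer thread and one consumer thread: because each counter is advanced by only one side, neither side ever takes a lock. The Tick and SpscRingBuffer names are hypothetical, and a production system would implement this in C++ or Java with atomic counters and memory barriers; Python serves here only to keep the illustration brief.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)        # frozen: trades are immutable real-world events
class Tick:
    symbol: str
    price: float
    size: int
    ts_ns: int                 # event timestamp in nanoseconds


class SpscRingBuffer:
    """Single-producer/single-consumer ring buffer. The producer alone
    advances head and the consumer alone advances tail, so no locks are
    needed. (Schematic only; a real system would use atomics.)"""

    def __init__(self, capacity: int = 1 << 16):
        self._buf = [None] * capacity
        self._cap = capacity
        self._head = 0         # next write slot, touched only by the producer
        self._tail = 0         # next read slot, touched only by the consumer

    def try_push(self, tick: Tick) -> bool:
        if self._head - self._tail == self._cap:
            return False       # full: caller must retry or drop the tick
        self._buf[self._head % self._cap] = tick
        self._head += 1
        return True

    def try_pop(self) -> Optional[Tick]:
        if self._tail == self._head:
            return None        # empty: nothing to consume yet
        tick = self._buf[self._tail % self._cap]
        self._tail += 1
        return tick
```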

As business goals push for more efficient and effective use of data, the coming year will bring a realization that a fit-for-purpose approach, based on content, usage and time-sensitivity, dictates the characteristics of optimal data management. Hadoop will continue to gain traction, but Nirvana for data management it is not. Yet the year will bring an understanding of its ideal fit alongside relational, in-memory and time-series (tick) database management systems.

Complex Event Processing (CEP) Technology Finds Its Place in the Industry

In physics, the second law of thermodynamics states that the entropy, or disorder, of a closed system tends to increase over time. It is the idea that a thing made up of constituent parts (like grains of sand) inevitably moves from order to disorder.

The software industry, one could argue, is anti-entropic, moving from disorder to order. Early-stage technology is often marked by dozens of vendors with competing products jockeying for brand awareness and market share (i.e., disorder). Eventually, Gartner’s Hype Cycle runs its course - a new software innovation achieves fashion-like hoopla that inevitably leads to disillusionment and eventual coalescence around a common understanding and standardization (i.e., order).

This past year marked two events in Complex Event Processing’s move from disorder to order - TIBCO’s acquisition of StreamBase and Software AG’s rescue of Apama from Progress Software. Together they marked the end of any notion that Complex Event Processing (CEP) is a standalone platform. Both vendors will bundle or otherwise incorporate their newly acquired prize into their existing software stacks of messaging, grid, data and visual tooling.

CEP is a story of the disruptive power of innovation. It unfolded as a new breed of technology offering an efficient means to process data, specifically the temporal analysis of time-series (streaming) data. It excels at filtering, enriching, transforming and aggregating that data. CEP is a temporally sensitive programming paradigm designed for calculating and extracting meaningful analytics that are unique to, and dependent on, data’s temporal nature; a small sketch of those operations follows below.
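As a rough, vendor-neutral illustration of filter, aggregate and enrich over a time-ordered stream - not any product’s API - the sketch below drops odd lots, maintains a sliding five-second window per symbol, and attaches a windowed VWAP to each surviving trade. The vwap_stream name, tuple layout and thresholds are all illustrative assumptions.

```python
from collections import defaultdict, deque


def vwap_stream(ticks, window_sec=5.0, min_size=100):
    """CEP-style pipeline over a time-ordered tick stream: filter out
    odd lots, maintain a sliding time window per symbol, and enrich
    each surviving trade with its windowed VWAP."""
    window = defaultdict(deque)               # symbol -> (ts, price, size)
    for ts, symbol, price, size in ticks:
        if size < min_size:                   # filter: drop odd lots
            continue
        q = window[symbol]
        q.append((ts, price, size))
        while ts - q[0][0] > window_sec:      # expire events past the window
            q.popleft()
        volume = sum(s for _, _, s in q)      # aggregate: windowed VWAP
        vwap = sum(p * s for _, p, s in q) / volume
        yield ts, symbol, price, vwap         # enrich: tick plus analytic


# A small, already time-ordered stream of (ts, symbol, price, size)
ticks = [(0.0, "IBM", 185.00, 200), (1.2, "IBM", 185.10, 300),
         (2.5, "IBM", 185.30, 50),  (6.1, "IBM", 185.20, 400)]
for ts, symbol, price, vwap in vwap_stream(ticks):
    print(f"t={ts:4.1f} {symbol} trade={price:.2f} vwap={vwap:.3f}")
```

Note that the window expiry relies entirely on events arriving in time order, which is the defining assumption of this style of processing.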

Time series refers to data that has an associated time sequence, a natural ordering to its content, such as FX rates, prices, curves, index compositions and so on. That temporal ordering allows CEP to perform distinct analyses over historical and/or real-time sources, revealing unique observations, detecting patterns and predicting future values; a toy example of pattern detection follows below. Complex Event Processing is but a subset of the greater technological advancement dominating the industry. It has always been about data - its management and analysis. The industry has now coalesced around that notion.
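One toy example of such a temporal pattern, again schematic rather than any engine’s API: detecting when a fast moving average crosses above a slow one in a time-ordered price series. The crossover_signals name and window lengths are illustrative choices.

```python
from collections import deque


def crossover_signals(prices, fast=3, slow=8):
    """Scan a time-ordered (ts, price) series for a simple temporal
    pattern: the fast moving average crossing above the slow one.
    Illustrative only, not a trading recommendation."""
    fast_q = deque(maxlen=fast)
    slow_q = deque(maxlen=slow)
    prev_diff = None
    for ts, px in prices:
        fast_q.append(px)
        slow_q.append(px)
        if len(slow_q) < slow:
            continue                  # not enough history yet
        diff = sum(fast_q) / fast - sum(slow_q) / slow
        if prev_diff is not None and prev_diff <= 0 < diff:
            yield ts, px              # fast MA just crossed above slow MA
        prev_diff = diff
```

Even at this toy scale, the detection works only because events arrive in time order - precisely the property around which CEP, and the broader data management story it belongs to, is built.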
