Since the earliest days of the financial markets, information has been a key element of success. In the past, market information was conveyed by methods that now seem quaint: carrier pigeons, personal conversations, printed materials sent through the post.
Today more data is generated in a 24-hour period than in entire centuries of the past, traveling at lightning speeds to all corners of the globe. Accessing, sorting, compiling, and leveraging that information is increasingly important in fast-paced markets and changing regulatory landscapes.
In this data economy, all kinds of businesses, from online retailers to pharmaceutical giants, are mining a wealth of information to better serve their customers, stay ahead of rivals, and improve the bottom line. The task is no less crucial in financial services.
The term “Big Data” has been used in a variety of ways, applied to everything from traditional relational databases to web-based sentiment-analysis tools. Just remember the three V's: the increasing velocity, volume, and variety of information available from a growing range of sources. All those bits and bytes only add up to something when they’re organized, arranged, and made coherent.
Not all analytics or data processing is big data. Trading and securities processing technologies have long been able to scale to meet the increased flow of electronic data resulting from market-structure change and increased electronic activity; high-speed trading is a good example. Complexity doesn’t necessarily mean big data. But big data is almost always complex -- which means it requires intelligent solutions if its potential is to be tapped.
Big data is directly tied to the rising importance of information management as a function within financial institutions. Regulatory, client, and internal drivers have forced most firms to reevaluate the core reference datasets on which they base their trading, risk management, and operational decisions. The proliferation of C-level executive positions dedicated to championing data management and data governance is a sign of this enhanced focus. Still, the majority of firms don’t have a big-data strategy in place across the enterprise. And few that do are equipped to manage the available data by themselves. That’s where firms such as Thomson Reuters can help.
As things stand now, big data can be very useful in analytics for trading and quantitative research, both linked to revenue generation. An increasing number of firms are attempting to gain insight from unstructured sources such as Twitter, news sites, and blogs while mining internal datasets. Our guidance and tools help clients sort, connect, understand, and leverage this data.
For example, through our vast Legal, Patent, and Life Science databases, we know exactly where any given drug stands in the FDA approval pipeline. That knowledge can have a huge impact on the stock performance of a small pharma or biotech company dependent on those drugs. Similarly, we collect real-time data from satellite imaging systems and combine it with weather and historical agricultural data to make early predictions on crop yields -- indispensable to any company active in the commodities markets.
Next page: Big-data challenges -- and getting past them

Debra Walton is Chief Content Officer at Thomson Reuters, leading a global team of more than 4,000 employees who are responsible for setting the strategy and managing the operations of Thomson Reuters' vast data and intelligent information resources.