It's hard to overstate how important time synchronization is to financial services. When transactions are conducted in nanoseconds across servers, applications, and geographic regions, even a millisecond of clock drift can mean the difference between a successful trade and a disastrous one, and between compliance and a regulatory violation.
"When did you receive the information?" a regulator may ask when seeking a trade reconstruction. "When did you decide to make a trade, place an order, and have that order acknowledged?" If all the clocks are out of sync, you cannot begin to reconstruct that timeline, explains Solarflare CEO Russell Stern.
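To see why clock skew matters for reconstruction, consider a toy sketch (not Solarflare's product, and the event names, times, and skews are invented): two of the four servers handling an order run half a millisecond slow, and sorting events by their recorded timestamps yields a sequence a regulator would reject.

```python
from datetime import datetime, timedelta

# Hypothetical events for one order. Each tuple is (event, true time,
# clock skew of the server that stamped it). Two servers run 500 µs slow.
events = [
    ("market data received", datetime(2013, 4, 1, 9, 30, 0, 100), timedelta(0)),
    ("decision made",        datetime(2013, 4, 1, 9, 30, 0, 350), timedelta(microseconds=-500)),
    ("order placed",         datetime(2013, 4, 1, 9, 30, 0, 600), timedelta(0)),
    ("order acknowledged",   datetime(2013, 4, 1, 9, 30, 0, 900), timedelta(microseconds=-500)),
]

# What each server actually records: the true time plus its clock skew.
recorded = [(name, true_t + skew) for name, true_t, skew in events]

# Reconstructing the timeline from the recorded stamps scrambles the order:
# the "decision" now appears to precede the market data it was based on.
reconstruction = sorted(recorded, key=lambda e: e[1])
print([name for name, _ in reconstruction])
```

With sub-millisecond skew the order appears to be decided before the information arrives and acknowledged before it is placed, which is exactly the kind of timeline a regulator cannot accept.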
[For more on time synchronization read: Check the Time: Majority of Firms have Time Synchronization Errors]
Solarflare has more than a bit of experience in this area. The firm has over 900 customers and a 95% share of the electronic trading market. Every major exchange uses its technology to make trades, including SolarCapture, which accurately timestamps the data.
It may seem surprising that this is a problem to begin with, but one look at the differing times on a wristwatch, the stove clock, the microwave, and the alarm clock might be enough to drive home the point.
So yes, time synchronization improves a firm's ability to stay on top of its transactions and stay in compliance, but Stern says this is just the beginning. "If you turn the crank even a little bit, you have the ability to capture and now filter and manipulate data, and therefore secure it in a number of ways, like denial-of-service and intrusion protection services. You're able to attach our product to live threat analysis, which are these data centers around the globe that monitor traffic on the internet and find where bad attacks are coming from. Their database collects bad addresses of threats and is updated on a millisecond-by-millisecond basis. One of the problems today is that firms often learn about attacks after the fact; with live threat analysis the hope is that updating address tables on a real-time basis will help filter the traffic."
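The filtering Stern describes can be sketched very simply: keep a set of known-bad source addresses, merge updates from a threat feed as they arrive, and drop traffic from anything on the list. This is an illustrative sketch only; the feed, the addresses (drawn from documentation-reserved IP ranges), and the packet format are all hypothetical, and a real implementation would work at the network card rather than in Python.

```python
# Known-bad source addresses, merged from a live threat feed.
blocklist = set()

def update_blocklist(feed_entries):
    """Merge the latest batch of bad addresses from the threat feed."""
    blocklist.update(feed_entries)

def filter_packets(packets):
    """Keep only packets whose source address is not on the blocklist."""
    return [p for p in packets if p["src"] not in blocklist]

# A feed update arrives (documentation-range IPs used as stand-ins).
update_blocklist({"203.0.113.7", "198.51.100.22"})

packets = [
    {"src": "192.0.2.10", "payload": "order"},
    {"src": "203.0.113.7", "payload": "probe"},  # known-bad source
]
print(filter_packets(packets))  # only the clean packet survives
```

The point of the real-time updates Stern mentions is that `update_blocklist` runs continuously, so an address flagged milliseconds ago is already being filtered, rather than discovered in a post-incident review.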
From 1GbE to 10GbE to 40GbE
According to the sixth annual State of the Network Survey, 77% of companies will deploy 10 Gigabit Ethernet (10GbE) within the next year. This is a tenfold jump over the 1GbE that is standard today. By moving to faster Ethernet, firms increase their network's capacity to carry more data, support higher transmission rates, and decrease latency.
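The capacity difference is easy to quantify with back-of-the-envelope arithmetic. The sketch below (the 1 GB burst size is an arbitrary illustration) compares how long a burst of market data occupies each link at pure line rate, ignoring protocol overhead and propagation latency.

```python
GIGABIT = 1e9  # bits per second in one "GbE" unit

def transfer_seconds(gigabytes, link_gbe):
    """Line-rate transfer time for a burst of data, overhead ignored."""
    bits = gigabytes * 8e9  # 1 GB = 8e9 bits
    return bits / (link_gbe * GIGABIT)

# Time to move a 1 GB burst at each link speed.
for link in (1, 10, 40):
    print(f"{link:>2} GbE: {transfer_seconds(1, link):.2f} s")
```

The same burst that occupies a 1GbE link for eight seconds clears a 40GbE link in a fifth of a second, which is why firms aggregating feeds from multiple markets outgrow 10GbE quickly.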
But industry experts are already looking well ahead, because given the ever-rising amount of data being collected and shared, 10GbE won't be sufficient for long. Firms will want to intelligently combine news sources from multiple markets, aggregating feeds together at a rate beyond 10GbE. 40 is the new magic number.
Most companies are still getting used to 10GbE, though over 50% of servers are now deployed with it, explains Stern. There have been a couple of barriers to growth, one being that the servers themselves have to be able to consume that much data. "With that bottleneck broken we see more 10GbE adoption. 40GbE is coming, and it will come faster behind 10 than 10 did behind 1, because the technology advances from 1 to 10 were greater than those needed to get from 10 to 40. 40GbE is actually four 10GbE links sewn together, so the technology hurdles aren't as drastic."
Stern predicts the adoption of 40GbE will pick up in the second and third quarter of 2013.
"Firms need technology that helps them more efficiently handle the trading volume in a way that can be monitored and regulated," concludes Stern. "That is an extremely important element of this technology, and you see much more adoption and companies trying to address the issue. At the same time, security issues loom on top of that. I think the market direction is to accelerate, monitor, and secure."

Becca Lipman is Senior Editor for Wall Street & Technology. She writes in-depth news articles with a focus on big data and compliance in the capital markets. She regularly meets with information technology leaders and innovators and writes about cloud computing, datacenters, ...