An Ounce of Prevention is Worth a Pound of Cure

Even with the new SEC regulations, US brokers have a lot of work to do to comply with market access rules.

Last month, Reuters reported that regulators from the Financial Industry Regulatory Authority (FINRA), the industry-funded body that polices Wall Street, stated, “some brokerages still do not have proper buffers in place to protect against technology errors or rogue algorithms which could rile markets, although many have improved their compliance with new rules.” These market access rules were put in place after the May 2010 “flash crash” to mitigate the impact of erroneous and non-compliant trades.

What is a “flash crash”? It is when a market suffers a dramatic drop in pricing in a very short period of time as a result of some kind of system- or trading-related error. On May 6, 2010, the Dow Jones Industrial Average lost an incredible 1,000 points, nearly 10 percent of its value, including a drop of 600 points in just five minutes. While most of those 600 points were recovered less than 30 minutes later, the 2010 flash crash was a clear example of a new risk in the market arising from the acceleration of computerized trading and the potential for computer trading malfunctions. The concept of “technology risk” entered the lexicon, and the task of dealing with it was added to the already full plate of the Chief Risk Officer.

In response to that crash, new market access regulations were implemented, including modernized circuit breakers, rules for breaking erroneous trades, and stricter minimum quoting standards. To prevent trades in an individual security from executing outside a specific price range, the Securities and Exchange Commission (SEC) in 2012 approved limits that block trades at prices more than a set percentage above or below the security’s average price over the preceding five minutes.
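
As a rough illustration of how such a price band works, here is a minimal sketch in Python; the 5 percent band, the PriceBand class, and the rolling-window bookkeeping are illustrative assumptions, not the SEC’s actual tier parameters.

```python
from collections import deque
from time import time

class PriceBand:
    """Illustrative single-stock price band: reject trades priced more than
    band_pct above or below the rolling average of the last five minutes."""

    def __init__(self, band_pct=0.05, window_secs=300):
        self.band_pct = band_pct
        self.window_secs = window_secs
        self.prices = deque()  # (timestamp, price) pairs, oldest first

    def record(self, price, now=None):
        """Fold a new reference price into the rolling window."""
        now = time() if now is None else now
        self.prices.append((now, price))
        # Drop prices that have aged out of the window.
        while self.prices and now - self.prices[0][0] > self.window_secs:
            self.prices.popleft()

    def allows(self, price):
        """True if the candidate price sits inside the current band."""
        if not self.prices:
            return True  # no reference yet; the real rules treat this case explicitly
        avg = sum(p for _, p in self.prices) / len(self.prices)
        return avg * (1 - self.band_pct) <= price <= avg * (1 + self.band_pct)
```

Under the assumed 5 percent band, band.record(100.0) followed by band.allows(106.0) returns False, while band.allows(104.0) returns True.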

This “single-stock” circuit breaker works alongside a market-wide circuit breaker that halts trading in all exchange-listed securities, and together they can be used to respond to potential flash-crash scenarios. The SEC also implemented new rules specifying when, and at what price, erroneous trades will be reversed. Finally, to improve quoting standards, the rules require market makers to maintain two-sided quotations within a certain percentage band of the national best bid and offer.

According to FINRA, even with these new rules in place, US brokers still have a lot to do to comply with market access rules. The most common problems documented by examiners were:

  • Lack of proper supervision
  • Inadequate pre-trade capital or credit thresholds
  • Allocation of certain risk management controls to customers without proper documentation

So what is really needed?

A recent area of focus has been more powerful risk controls that provide greater intelligence at the point of trade and finer-grained intervention in the event of a problem. Rather like the fuse board in a house, such controls would monitor the activities of human traders and computer algorithms and, when there’s a problem, immediately “trip the switch” to break the circuit and isolate the miscreants, whilst allowing other parties to carry on unhindered. After all, there’s no point switching off the refrigerator and spoiling the food if it’s only the lights that have blown!
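
A minimal sketch of that fuse-board idea, assuming a hypothetical per-trader error budget (the TraderFuse and FuseBoard names and the three-strike threshold are inventions for illustration):

```python
class TraderFuse:
    """One fuse per trader or algorithm: after max_errors problems,
    the fuse trips and isolates that party only."""

    def __init__(self, max_errors=3):
        self.max_errors = max_errors
        self.errors = 0
        self.tripped = False

    def record_error(self):
        self.errors += 1
        if self.errors >= self.max_errors:
            self.tripped = True  # break this circuit; leave the rest alone


class FuseBoard:
    """Routes orders only for parties whose fuse is still intact."""

    def __init__(self):
        self.fuses = {}

    def allow(self, trader_id):
        return not self.fuses.setdefault(trader_id, TraderFuse()).tripped

    def report_error(self, trader_id):
        self.fuses.setdefault(trader_id, TraderFuse()).record_error()
```

Three errors from one algorithm trip only that algorithm’s fuse; every other trader’s orders keep flowing, just as blown lights leave the refrigerator running.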

Providing such fine-grained intelligence requires understanding the context of trading activity. For example, is an algorithm submitting trades at a higher frequency than normal? Is it submitting trades too far away from the market, or at an unchanging price? Is a trader dealing in a stock they don’t normally trade, or outside usual trading hours? Making these kinds of decisions requires combining real-time events with historical data to enrich the decision-making process, and doing so instantly. There’s no point in tripping the switch after the house is on fire!
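
To make the frequency check concrete, here is a hedged sketch of one such contextual control, assuming the historical baseline (the mean and standard deviation of an algorithm’s normal order rate) has already been computed from stored data; the class name and four-sigma threshold are illustrative choices:

```python
import time

class RateAnomalyCheck:
    """Flag an algorithm whose live order rate is far above its historical
    norm. The baseline mean and standard deviation would be pulled from a
    historical store; here they are passed in directly."""

    def __init__(self, baseline_mean, baseline_stdev, n_sigma=4.0):
        self.baseline_mean = baseline_mean    # normal orders per second
        self.baseline_stdev = baseline_stdev
        self.n_sigma = n_sigma
        self.recent = []  # timestamps of orders seen in the last second

    def on_order(self, now=None):
        """Record one order; return True if the live rate looks anomalous."""
        now = time.time() if now is None else now
        self.recent = [t for t in self.recent if now - t <= 1.0]
        self.recent.append(now)
        live_rate = len(self.recent)
        return live_rate > self.baseline_mean + self.n_sigma * self.baseline_stdev
```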

Across the spectrum of risk management, increasing amounts of data need to be analyzed in decreasing time windows to make smart and timely decisions. In managing client margin, the ability to set client trading limits in real time, by marking existing cash and asset holdings to market across the entire portfolio of products held, is becoming a fundamental requirement of effective risk management. In post-trade monitoring, tracking the real-time impact of trading on market price and volume is proving an effective weapon for detecting errant trading, and will become a standard component of a multi-layered risk strategy.
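
As an illustration of the margin side, a simplified sketch, assuming a flat hypothetical margin rate rather than real per-product margining:

```python
def margin_headroom(positions, prices, cash, margin_rate=0.25):
    """Mark a client's portfolio to market and return remaining headroom.

    positions: {symbol: signed quantity}; prices: {symbol: latest price}.
    margin_rate is a flat hypothetical requirement; real margining is
    per-product and considerably richer.
    """
    market_value = sum(qty * prices[sym] for sym, qty in positions.items())
    equity = cash + market_value
    required = sum(abs(qty) * prices[sym] for sym, qty in positions.items()) * margin_rate
    return equity - required  # block new orders as this approaches zero
```

Recomputed on every price tick, a figure like this lets trading limits tighten automatically as the market moves against the client.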

Providing the required intelligence, and delivering it instantly at the point of trade, requires bringing data and analytics together in real time. Risk management has long been a “big data” problem; it has now become a fast big data problem. Real-time event processing combined with in-memory data management, capable of delivering huge volumes of contextualizing data to the point of decision, is becoming the default platform for modern pre- and post-trade risk capabilities.
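
A toy version of that pattern, joining a real-time trade stream with an in-memory price history to spot outsized price impact; the ImpactMonitor class, its 2 percent threshold, and the 100-trade window are assumptions for illustration:

```python
from collections import defaultdict, deque

class ImpactMonitor:
    """Toy fast-big-data pattern: enrich each streaming trade with in-memory
    price history and flag moves larger than max_impact within the
    retained window."""

    def __init__(self, max_impact=0.02, depth=100):
        self.max_impact = max_impact
        self.history = defaultdict(lambda: deque(maxlen=depth))  # symbol -> prices

    def on_trade(self, symbol, price):
        """Return True if this trade moved the price beyond the threshold."""
        hist = self.history[symbol]
        alert = bool(hist) and abs(price - hist[0]) / hist[0] > self.max_impact
        hist.append(price)
        return alert  # True -> a candidate for tripping the switch
```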

Investing in these capabilities is no longer optional. Last year, a software error at Knight Capital generated millions of incorrect orders over a 45-minute period, leading to an eye-watering $440 million in losses (not to mention the $12 million fine Knight subsequently paid the SEC). This is a clear example of what happens when adequate real-time pre- and post-trade risk controls are absent. Even if such erroneous trading had slipped past moderately smart pre-trade risk controls, real-time post-trade impact analysis would have flagged the problem and tripped the switch before too much damage was done. As it was, Knight nearly went bankrupt and had to be rescued by its industry brethren, and the rest is history.

Preventing a problem like Knight Capital’s from occurring in the first place saves a great deal more in time, effort and cost than trying to repair the damage afterwards. Clearly, Ben Franklin was ahead of his time when he noted, “An ounce of prevention is worth a pound of cure.”

Dr. John Bates is a Member of the Group Executive Board and Chief Technology Officer at Software AG, responsible for Intelligent Business Operations and Big Data strategies. Until July 2013, John was Executive Vice President and Corporate Chief Technology Officer at Progress ...
