The crash of 1987 happened 25 years ago when the markets were far less automated, but in many ways, that day set a precedent for the Flash Crash and other technology glitches that have recently bedeviled the stock market.
Computerized trading was the scapegoat on that infamous day, known as Black Monday, and today many are blaming high-frequency trading, along with a lack of market-making obligations, for volatile price swings in stocks.
There's also a ton of debate these days about how circuit breakers should function and how to define the obligations of market makers in an environment where high-frequency traders supply most of the liquidity. But what lessons can be drawn from the Black Monday crash? And could a crash of this magnitude happen again?
On that day, the Dow Jones Industrial Average plummeted 508 points, sapping 23 percent of the market's value and easily dwarfing the 12.3 percent plunge in 1929 that's credited with setting off the Great Depression.
The culprit back in 1987 was a computerized trading strategy known as portfolio insurance. This was a so-called dynamic hedging strategy that involved using stock index futures and options to protect institutional investor holdings from price declines.
The idea was that money managers would sell ever-increasing numbers of futures contracts to offset losses in their stock holdings. The product was heavily marketed to institutional money managers and probably generated high fees for its creator, Leland O'Brien Rubinstein Associates (LOR), though the firm ultimately closed up shop.
And while the strategy was tied to academic research, it was never tested in a turbulent market decline.
As Floyd Norris writes in today's New York Times:
Portfolio insurance did not start the widespread selling of stocks in 1987. But it made sure that the process got out of hand.
Computers determined how many stock index futures contracts needed to be sold for each money manager.
The problem was that the firms buying the futures demanded lower prices, and they hedged that exposure by heavily selling the underlying stocks. This in turn drove prices down further and triggered more sell orders from the computers, the Times story notes.
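The feedback loop described above can be sketched in a toy simulation. Everything here is illustrative: the hedge rule (hedge twice the drawdown), the price-impact coefficient, and the initial shock are invented numbers, not calibrated to 1987 data — the point is only to show how mechanical selling into a falling market feeds on itself.

```python
# Toy model of the 1987 portfolio-insurance feedback loop.
# All parameters (hedge rule, price impact, shock size) are invented
# for illustration and are not calibrated to real 1987 data.

def target_hedge(drawdown):
    """Toy rule: hedge a growing fraction of the portfolio as losses deepen."""
    return min(1.0, 2.0 * drawdown)  # e.g. a 10% drawdown -> 20% hedged

def simulate(start_price=100.0, shock=-0.05, impact=0.5, steps=6):
    """Each round of futures selling pushes the index down, which the
    hedging computers answer with still more selling."""
    price = start_price * (1 + shock)  # initial external shock
    hedged = 0.0
    path = [round(price, 2)]
    for _ in range(steps):
        drawdown = 1 - price / start_price
        extra = target_hedge(drawdown) - hedged  # new futures to sell
        if extra <= 0:
            break
        hedged += extra
        price *= 1 - impact * extra  # selling pressure moves the price
        path.append(round(price, 2))
    return path

print(simulate())  # each element is lower than the last: the loop feeds itself
```

Running the sketch shows a small external shock turning into a steadily deepening decline, with each round of hedge-driven selling creating the drawdown that justifies the next round.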
In the aftermath of that crash, program trading became a whipping boy. It was program traders who had sold stocks as futures prices plunged, and a lot of traditional portfolio managers wrongly believed they had caused the debacle.
Reflecting on Black Monday and its relevance to some of the recent disasters, Norris points to the excessive belief in computers as "the beginning of the destruction of markets." He contends that people are blindly following their computers without understanding their limitations. Such ignorance was on display during both the recent Knight Capital episode, and the Flash Crash.
During the Flash Crash, a 'fat finger' error by a buy-side trader caused thousands of S&P 500 E-mini contracts to be executed through a broker's algorithm in one fell swoop, rather than parceled out over time.
However, many market participants don't believe this error had anything to do with the Flash Crash, and point instead to market fragmentation and the differing trading speeds and rules governing the nation's 13 stock exchanges. The NYSE stood out because it allowed its specialists to halt trading, and the other markets did not follow suit.
More recently, Knight Capital's error in releasing a test version of its software nearly toppled the firm, and has raised questions about the lack of human oversight and risk controls. This time, the NYSE continued to trade until Knight pulled the plug.
With both episodes in mind, MarketWatch's Mark Hulbert argues that a repeat of the 1987 crash is inevitable. At current market levels, he contends, such a decline would wipe 3,000 points off the Dow Jones Industrial Average in a single session.
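Hulbert's figure checks out as back-of-the-envelope arithmetic. Assuming a Dow level of roughly 13,000 (about where the index stood when this was written; the exact level is an assumption here), a Black Monday-scale decline of about 22.6 percent comes to roughly 3,000 points:

```python
# Back-of-the-envelope check of the 3,000-point claim.
# The Dow level of 13,000 is an assumption for illustration.
dow_level = 13_000
black_monday_drop = 0.226  # the actual one-day percentage loss on Oct. 19, 1987

points_lost = dow_level * black_monday_drop
print(round(points_lost))  # on the order of 3,000 points
```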
The market in 1987 was mainly floor-based, with specialists who would buy when investors were selling, and sell when investors were buying. But today's liquidity providers are HFT firms with no obligation to keep making markets when everyone is panicking.
Today's buyers and sellers are so much faster and more interconnected that a crash, if it occurred, could be even more devastating.
That is why exchanges impose circuit breakers when a stock moves more than 10 percent, and regulators are looking to implement limit-up/limit-down price bands. The NYSE has other tools at its disposal, such as shutting off electronic trading in a stock and conducting an auction to stabilize it.
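The single-stock circuit breaker can be sketched as a rolling-window price check. This is a simplified illustration, not the exchanges' actual implementation: the five-minute window and 10 percent threshold echo the post-Flash Crash rules, but the reference-price logic and pause handling here are stripped down for clarity.

```python
# Simplified sketch of a single-stock circuit-breaker check: pause
# trading if the price moves 10% or more against the oldest price in a
# rolling five-minute window. Not the exchanges' actual implementation.
from collections import deque

class CircuitBreaker:
    def __init__(self, threshold=0.10, window_secs=300):
        self.threshold = threshold
        self.window_secs = window_secs
        self.prices = deque()  # (timestamp, price) pairs inside the window

    def on_trade(self, ts, price):
        """Record a trade; return True if trading should be paused."""
        self.prices.append((ts, price))
        # Drop trades that have aged out of the rolling window.
        while self.prices and ts - self.prices[0][0] > self.window_secs:
            self.prices.popleft()
        reference = self.prices[0][1]  # oldest price still in the window
        return abs(price - reference) / reference >= self.threshold

cb = CircuitBreaker()
print(cb.on_trade(0, 40.00))    # False: first trade, no move yet
print(cb.on_trade(60, 38.50))   # False: ~3.8% below the reference
print(cb.on_trade(120, 35.50))  # True: >10% below the 5-minute reference
```

Limit-up/limit-down works differently in practice, rejecting or pausing orders outside a moving price band rather than halting after the fact, but the rolling-reference idea is the same.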
The crash of 1987 exposed other flaws in the system. On that day, the tape ran hours behind in updating prices, so traders could not see where stocks were actually trading, which may have contributed to their panic.
Nasdaq market makers were not picking up their telephones to take customers' orders, despite their obligations to do so.
Over the past 25 years, reliance on computers has made the market more democratic. Today, there is much more transparency: investors and traders have access to real-time prices via the Internet and can subscribe to Level 2 feeds showing all bids and offers for every stock across the market.
While there are still complaints about HFT firms gaining an edge with faster feeds and colocation near the trading venues, those tools are available to anyone who can pay for them. Today, the concern is the rollout of new order types and algorithms that haven't been adequately tested and can bring unintended consequences, because the market is more complex and wired than it was in 1987. The challenge now is to find the right tools, so that when the system is shocked, it can be steadied again.