The 2010 ‘flash crash’, which saw US stock markets briefly dive 9%, has led to increased buy-side scrutiny of automated trading systems both in the US and globally as the industry marks the fourth anniversary of the event.
On 6 May 2010, the Dow Jones Industrial Average suffered its second-biggest point swing on record, suddenly dropping 1,010.14 points at 2:45pm EDT.
At the time of the crash, markets were already depressed by the Greek debt crisis, and the Dow Jones had already fallen more than 300 points that day before declining rapidly over a period of just five minutes. However, just 20 minutes later the market regained much of its lost ground, leaving investors and regulators mystified as to the cause.
An investigation into the crash by the Securities and Exchange Commission (SEC) and Commodity Futures Trading Commission (CFTC) blamed the fragmented nature of the US stock market, aggressive selling by high-frequency traders and the impact of the sudden spike in volumes on the behaviour of trading algorithms.
Since 2010, there have been numerous market glitches, including one caused by a rogue market-making algorithm that cost Knight Capital over US$400 million, but doubts remain over whether regulatory action will be effective in preventing similar problems in the future. Additionally, politicians and the general public are now increasingly demanding action to further reinforce US equity market stability and regulate high-frequency trading (HFT) following the publication of Michael Lewis’ book, ‘Flash Boys’, which details the rise of HFT in the US.
While retail investors are still worried about the impact of HFT and algorithmic trading on their investments, Mark Pumfrey, CEO of Liquidnet Europe, said institutional investors are better informed on the issues.
“The debate on market stability – and the role of HFT in particular – has been reignited recently, but it’s an issue market participants have been aware of for years,” he said.
Buy-siders have become increasingly demanding when using algorithms, requesting far more information on both the algorithm and the pedigree of the developers than ever before, according to Sang Lee, managing partner at consultancy Aite Group.
“There has been a realisation that the sell-side can’t just give clients algorithms to use: they need more support,” he said. “Similarly the buy-side is asking for a lot more information about how these algos actually work and brokers need to be able to provide this.”
Pumfrey agrees that institutions are looking more closely at the way they use algos than ever before: “The buy-side has evolved and trading desks are adding value by scrutinising algorithms to ensure they can generate alpha for their clients.”
Four years on from the crash, new regulations covering algorithmic trading and HFT are beginning to be introduced across many jurisdictions. Germany recently introduced rules to bring HFT firms under BaFin’s regulatory umbrella and require algorithms to be thoroughly tested before deployment. Similar rules are set to be implemented Europe-wide as part of MiFID II.
“Changes to ensure there is more testing of algorithms should help to improve stability. The people designing algorithms should have to meet a minimum level of competency. Markets today are so complex it is sensible to ensure there is some protection in place for investors,” added Pumfrey.
In the US, algorithms have yet to be directly scrutinised, but safety measures known as circuit breakers have been imposed, which pause trading when individual stocks are subject to major price movements in either direction. The SEC also banned ‘naked access’, which had enabled some traders to buy and sell stock without any pre-trade risk checks, and has made significant investments in technology to monitor trading activity more closely. The recent public outcry over HFT has not yet prompted the SEC to introduce specific rules, but it does plan to address the issue as part of a wider market structure review due in the coming years.
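The single-stock circuit breaker mechanism described above can be sketched in a few lines of code. The 10% threshold and five-minute window below are illustrative only, loosely modelled on the post-2010 single-stock breakers; the SEC’s actual bands (later replaced by the ‘limit up-limit down’ regime) vary by security:

```python
from collections import deque


class CircuitBreaker:
    """Illustrative single-stock circuit breaker: signal a trading halt
    when the price moves more than `threshold` from its level at the
    start of a rolling window. Parameters are hypothetical, not the
    SEC's actual bands."""

    def __init__(self, threshold=0.10, window_secs=300):
        self.threshold = threshold
        self.window_secs = window_secs
        self.prices = deque()  # (timestamp, price) pairs within the window

    def on_trade(self, price, now):
        self.prices.append((now, price))
        # Drop prices that have fallen out of the rolling window.
        while self.prices and now - self.prices[0][0] > self.window_secs:
            self.prices.popleft()
        reference = self.prices[0][1]
        move = abs(price - reference) / reference
        return move > self.threshold  # True => pause trading


breaker = CircuitBreaker()
breaker.on_trade(100.0, now=0)    # first trade sets the reference price
breaker.on_trade(95.0, now=60)    # -5% in one minute: within the band
breaker.on_trade(88.0, now=120)   # -12% in two minutes: breaker trips
```

In a real venue the halt decision sits in the matching engine and the reference price is defined by rule rather than by the first trade seen, but the core idea, comparing each trade against a rolling reference and pausing on an outsized move, is the same.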
Additionally, the CFTC has begun consulting on the use of automation in trading derivatives, which could see new rules on algorithms introduced in the future.
Lee believes that, while lessons have been learned from the flash crash, the process of trying to prevent a similar event in the future is in its infancy.
“Are we in a better position now than we were four years ago? Yes, in the sense that we know what went wrong. But we still don’t have any real solutions to the problems the flash crash highlighted.”