The faltering start to the Facebook IPO suggests that exchanges and market participants are reaching a technology dead end, forcing them to make a difficult choice, according to Terry Keene, president and CEO at enterprise technology vendor iSys.
On 18 May, the launch of Facebook's IPO on US exchange Nasdaq OMX was delayed by around 30 minutes because of a technology fault. A massive surge in volume caused by the IPO overwhelmed the exchange's systems, resulting in some brokers being disconnected from the exchange during the auction process and a failure in trade confirmations. For a number of hours, the glitch left many buy-side firms uncertain of their exact exposure and of whether their trades had been executed at the opening price.
Keene believes the incident should be taken as a warning. The current Intel technology used by the majority of exchanges and major investment firms is reaching the end of its capabilities, due to inbuilt structural limitations, he said. As connections between markets become ever more crowded, the potential for bottlenecks, delays and crashes becomes ever greater.
“If you are an exchange, it is hard to prepare for more volume,” he told theTRADEnews.com. “The only way you can add more capacity is by adding more dual-chip boxes - but every time you do that, it adds latency. Trading systems are becoming increasingly complex and unstable, capacity is more and more strained, and it can't be fixed without adding massively to latency.”
Big data has become an increasingly prominent focus for exchanges and their buy- and sell-side participants in recent years. With rising market fragmentation, shrinking order sizes and ever-lower latency, the volume of data their trading infrastructure has had to handle has increased exponentially.
Rival systems, such as IBM's Power, may provide part of the answer, Keene suggested. That is the architecture used by Watson, the supercomputer that defeated two champions on US game show Jeopardy! last year. The IBM system is built around multi-threading, enabling it to process many instructions in parallel and therefore far more quickly.
However, to gain from this technology, exchanges would have to make considerable changes to their internal processes, since trading engines would need to be connected to the appropriate network of compliance, market data and monitoring systems. Trading members would also have to rewrite their algorithms along with much of their trading architecture – a long and painful process that would require considerable investment. Yet Keene insisted it would be worthwhile.
“Once they make the investment, we are talking about processing that is several orders of magnitude more powerful,” he said. “So not just twice as fast, or even ten times as fast, more like 100 times or more. The writing is on the wall for the traditional trading systems.”