Keeping it real-time

Faster data processing has opened up a wealth of opportunities for market participants. It gives traders greater ability to arm algos while avoiding harmful market activity and meeting increasingly stringent client and regulatory demands.

Is real-time processing causing the data revolution it promised? 

Faster processing of data is a cross-industry trend gaining traction as businesses exploit technology to make decisions on complex events in real time. But for trading, the stakes have never been higher.

For electronic trading, the ability to process huge amounts of information from a range of sources in short periods of time has led to major changes in the trading landscape.

While complex event processing (CEP) and real-time transaction cost analysis (TCA) have become industry standards, this level of timely automation of trading strategies isn’t the only aim of high-speed number crunching.

For stat-arb firms the millisecond may be king, but for other market participants recent advances in technology have brought a wider range of practical changes. Data that would once have taken days to compile can now be assembled in hours, letting decision-makers act faster.

The aggregation of risk measures is a good example. Previously, figures from risk systems operating in silos would be reported to individual business intelligence tools the following day, and someone would then aggregate the data into a common view for further analysis. Building a company-wide view of a particular risk measure could take days, sometimes weeks. Now, that process needs to happen on the same day, if not in real time.
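By way of illustration, the sketch below shows how incremental updates from siloed systems might be folded into a single firm-wide exposure view as they arrive, rather than in a next-day batch. The silo names, message fields and the choice of net counterparty exposure as the measure are assumptions for the example, not a reference to any particular system.

```python
# Minimal sketch: streaming aggregation of risk updates from siloed systems
# into one firm-wide view. Names and fields are illustrative assumptions.
from collections import defaultdict


class FirmWideRiskView:
    """Maintains a running, firm-wide net exposure per counterparty."""

    def __init__(self):
        self.exposure_by_counterparty = defaultdict(float)
        self.silos_reporting = set()

    def on_risk_update(self, silo: str, counterparty: str, delta_exposure: float) -> None:
        # Each siloed system pushes incremental updates as they happen,
        # instead of a next-day report to its own business intelligence tool.
        self.exposure_by_counterparty[counterparty] += delta_exposure
        self.silos_reporting.add(silo)

    def snapshot(self) -> dict:
        # The aggregated, company-wide view is available on demand, intraday.
        return dict(self.exposure_by_counterparty)


view = FirmWideRiskView()
view.on_risk_update(silo="rates_desk", counterparty="BANK_A", delta_exposure=1_500_000.0)
view.on_risk_update(silo="equities_desk", counterparty="BANK_A", delta_exposure=-400_000.0)
print(view.snapshot())  # {'BANK_A': 1100000.0}
```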

Will this push for real-time data see algos create too much complexity in the market? 

As algo trading develops, a widening range of signals is used to feed these complex models, helping firms stay ahead of the curve and avoid negative market shifts while chasing alpha.

As the web of relationships between participants becomes more tangled, the potential for high-impact, high-speed market moves has also grown, and recent examples show how high the stakes are.

Not so long ago, algos were built on historical data stretching back years. Now, the bulk of the data fuelling them stretches back only weeks, and is drawn from a far greater number of inputs.

Despite the wider array of data sources, today's low-volume environment means volatile events have a stronger impact on the market. Incidents such as the Knight Capital algo glitch of August make real-time data even more important for electronic trading, as firms need to ensure their own algos behave appropriately and to insulate themselves from harmful activity elsewhere.
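As a rough illustration of the kind of safeguard this implies, the sketch below shows a simple pre-trade check, run against live market data, of the sort a firm might apply before an algo's order leaves the building. The size, notional and price-collar thresholds and the order fields are assumptions for the example only; real controls are far more extensive.

```python
# Minimal sketch of a pre-trade safeguard intended to catch a runaway algo
# before its orders reach the market. All limits here are illustrative.
from dataclasses import dataclass


@dataclass
class Order:
    symbol: str
    side: str          # "buy" or "sell"
    quantity: int
    limit_price: float


MAX_ORDER_QUANTITY = 10_000    # assumed per-order size cap
MAX_NOTIONAL = 1_000_000.0     # assumed per-order notional cap
PRICE_COLLAR = 0.05            # assumed max deviation from the last trade price


def passes_pre_trade_checks(order: Order, last_price: float) -> bool:
    """Reject obviously erroneous orders before they leave the firm."""
    if order.quantity > MAX_ORDER_QUANTITY:
        return False
    if order.quantity * order.limit_price > MAX_NOTIONAL:
        return False
    # Price collar: block orders priced far away from the current market.
    if abs(order.limit_price - last_price) / last_price > PRICE_COLLAR:
        return False
    return True


print(passes_pre_trade_checks(Order("XYZ", "buy", 500, 20.10), last_price=20.00))     # True
print(passes_pre_trade_checks(Order("XYZ", "buy", 50_000, 20.10), last_price=20.00))  # False
```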

Real-time TCA data has also become an industry standard, helping both algos and buy-side traders decide where, when and what to trade, and avoid negative market events.
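To make the idea concrete, here is a minimal sketch of one such measure: share-weighted slippage of incoming fills against the arrival price, updated as each fill streams in. The arrival-price benchmark is one common TCA yardstick; the data shapes used here are assumptions for illustration, not a standard feed format.

```python
# Minimal sketch of a real-time TCA measure: slippage of each fill against
# the arrival price, in basis points, with a running share-weighted average.
def slippage_bps(fill_price: float, arrival_price: float, side: str) -> float:
    """Positive values mean the fill was worse than the arrival benchmark."""
    signed = 1.0 if side == "buy" else -1.0
    return signed * (fill_price - arrival_price) / arrival_price * 10_000


arrival_price = 20.00
fills = [(20.02, 300, "buy"), (20.05, 200, "buy")]  # (price, quantity, side)

total_qty = 0
weighted_bps = 0.0
for price, qty, side in fills:
    # Update the running average as each fill arrives, rather than end of day.
    weighted_bps += slippage_bps(price, arrival_price, side) * qty
    total_qty += qty

print(round(weighted_bps / total_qty, 2))  # 16.0 bps of slippage so far
```

A trader or algo watching this number drift upwards intraday can adjust venue, timing or aggressiveness while the order is still working, rather than learning about the cost in a next-day report.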

Will increased speed in data processing cause greater transparency in line with regulations or simply tighter reporting deadlines? 

The importance of handling high-volume, high-velocity data, and of extracting business value from it, will only increase as both regulators and clients push for greater transparency.

Trading firms will need to ensure they can use real-time data to assess risk and to evidence their trading decisions.

It won’t just be regulators demanding the over-the-counter derivatives trade reporting enshrined in the European Market Infrastructure Regulation and the US Dodd-Frank Act; internal risk analysts and clients, hungry for improved performance, will want the same information.

Quite simply, gathering data in a more timely manner will help firms meet post-trade reporting obligations, and having the systems in place before competitors do will become a point of competitive advantage. In one instance, faster calculation of client risk positions would allow a firm to take on more clients.

Real time may be old news, but staying ahead of the curve has never been more important.
