Algorithmic trading: Smarter than ever?

With growing client expectations and a constantly developing market landscape, Wesley Bray explores the evolution of algorithmic trading, delving into its use cases, the importance of data and trader intuition and how algo strategies are utilised during periods of high volatility.

In the dynamic realm of the financial markets, the introduction of technology has proven to be a catalyst for transformative change, overhauling existing trading strategies. Among the wide range of advancements, algorithmic trading has revolutionised how financial instruments are bought and sold.

As markets become increasingly complex and interconnected, the need for speed, precision, and automation has become paramount. From the early days when algorithms were basic rule-based systems executing predefined strategies, to the present era of machine learning and artificial intelligence-driven models, the evolution of algorithmic trading is central to the adaptability of financial markets. 

“Algorithmic development has and always will evolve to achieve the best performance possible versus the client benchmark,” says Alex Harman, head of EMEA electronic and program trading at Goldman Sachs. “That would involve minimising footprint via enhancing order placement and internalisation, having extensive liquidity capture via the SOR and a framework of customisable algorithms built upon a fast, scalable algorithmic platform.”

Simplification and automation have been key focus areas on the buy-side. The goal is to enable high touch traders to adapt dynamically to market conditions while still ensuring that algo strategies remain simple so that those strategies can be correctly measured and compared. 

“On our systematic side, we have adaptive algorithms that are identified in our EMS and routed to multiple venues to achieve the optimal outcome,” notes Samuel Henderson, EMEA equities head trader at Invesco.

“This adaptive automation allows us to manage hundreds of orders quickly and efficiently at below pre-trade costs and most importantly without adverse selection. As we expand our database of measurable historic trades, our machine learning insights continue to enhance the decision making of our algos.”

Evolving client demand has driven innovation in algo trading. More traditional strategies, such as VWAP, have begun to incorporate machine learning and predictive techniques to remain relevant. Increasingly, clients are looking for more advanced methods of liquidity seeking, particularly in harder-to-trade stocks during liquidity events such as the close and monthly expiries.

“Trading is always a trade-off between price impacts and opportunity risks,” says Ben Springett, head of electronic and program trading, EMEA, at Jefferies. “The longer I take, the more opportunity/risk I’m exposed to. We see a migration of strategies toward higher urgency liquidity seeking; we see people moving away from VWAP; and we see less people willing to wait for the closing auction.”

During periods of high volatility, some quant funds, as well as funds that typically use long-duration or schedule-based strategies such as VWAP or TWAP, shift urgency towards more arrival-like benchmarks, such as liquidity-seeking algos.
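To make the schedule-based approach concrete, the sketch below shows how a VWAP-style algo might slice a parent order across intraday time buckets in proportion to expected volume. The volume profile and bucket granularity are illustrative assumptions, not any broker's actual curve.

```python
# Minimal sketch of schedule-based (VWAP-style) order slicing.
# The U-shaped volume profile below is an illustrative assumption.

def vwap_schedule(parent_qty, volume_profile):
    """Split a parent order across time buckets in proportion to the
    expected share of daily volume traded in each bucket."""
    total = sum(volume_profile)
    schedule = []
    allocated = 0
    for i, v in enumerate(volume_profile):
        if i < len(volume_profile) - 1:
            qty = round(parent_qty * v / total)
        else:
            qty = parent_qty - allocated  # last bucket absorbs rounding
        allocated += qty
        schedule.append(qty)
    return schedule

# e.g. a U-shaped intraday profile: heavy at the open and the close
profile = [18, 10, 7, 6, 6, 7, 9, 12, 25]
print(vwap_schedule(100_000, profile))
```

A liquidity-seeking algo, by contrast, abandons this fixed schedule and accelerates opportunistically when liquidity appears, which is why urgency-sensitive clients migrate towards it in volatile markets.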

“Clients that continue using schedule algos tend to shorten the order duration and in addition, they look to customise participation in the closing auction,” notes Harman. 

Volatility

During periods of increased volatility, algo strategies come to the forefront, though how they are used does not change entirely. Rather, the shift is a by-product of a change in the buy-side trader's objective. A greater sense of immediacy becomes apparent for traders in these periods, resulting in a shift in algo strategies.

Typically, as volatility increases, liquidity decreases, resulting in an increase in impact. In such periods, it can be observed that traders move from automated algo trading to high touch and portfolio trading, relying on more blocks. 

“As market volatility increases, we find clients tend to specify more algo parameters on an order level i.e., ‘offsets to benchmark’, ‘would levels’ and ‘smart scaling’,” notes Chris McConville, global head of execution services and trading at Kepler Cheuvreux Execution Services (KCx). “We also see an increase in customised algo usage. In more recent situations where market volatility has increased, we saw an increase in demand for agency blocks.”

Information leakage

When utilising algorithms, preventing information leakage becomes paramount, especially when breaking up orders and dealing with multiple banks. Various techniques exist to combat the issue, including splitting larger parent orders into smaller child orders to disguise the full intent of a trade from both the market and any single broker.

“Trading electronically can prevent word of mouth leakage – but similarly information leakage can be created by using the wrong algorithm or venue in the wrong way,” emphasises Invesco’s Henderson.  

Unpredictability also takes precedence. Firms need to be unpredictable in how they respond to price changes, in the sizes they submit to markets, and in their overall presence in the market.

Reducing the predictability of the algo order placement – the child orders in the market – can also help reduce information leakage. In essence, preventing information leakage in algorithmic trading hinges on the intelligent design and execution of algorithms and SORs, notes McConville.

“Strategies like randomising order sizes, managing market entry times, utilising multiple and non-displayed trading venues, and deploying conditional orders are vital,” he says. “These approaches not only protect a trader’s strategy but also enhance the efficacy of their trades in a complex, multi-bank environment.”
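The randomisation techniques McConville describes can be sketched in a few lines. This is a hypothetical illustration only; the clip size, jitter band and spacing window are invented parameters, not any provider's actual logic.

```python
# Hypothetical sketch of randomising child-order sizes and submission
# times to reduce the predictability of an algo's footprint.
# Size bands and timing jitter are illustrative assumptions.
import random

def randomised_children(parent_qty, base_size, jitter=0.4, interval_s=30):
    """Return (delay_seconds, qty) child orders whose sizes and spacing
    vary randomly around a base clip, making the pattern harder to read."""
    remaining = parent_qty
    t = 0.0
    orders = []
    while remaining > 0:
        qty = min(remaining,
                  int(base_size * random.uniform(1 - jitter, 1 + jitter)))
        qty = max(qty, 1)
        t += interval_s * random.uniform(0.5, 1.5)  # jittered send time
        orders.append((round(t, 1), qty))
        remaining -= qty
    return orders

random.seed(7)
for delay, qty in randomised_children(5_000, base_size=800):
    print(f"t+{delay:>6}s  child order for {qty} shares")
```

A deterministic version of this loop (fixed clip, fixed interval) is exactly the kind of repeating signature that counterparties can detect and trade against.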

Limitations of algo trading

As with any technological advancement, algorithmic trading has its limitations. Although these have narrowed, the strategies can still be improved. One of the biggest limitations is a lack of understanding of the context behind the orders being placed.

“Even the most sophisticated algorithm cannot know that the portfolio manager has been waiting three days to find liquidity, a potential catalyst is approaching, or that a bullish research note was published earlier that day,” highlights Phil Risley, head of trading and product development at Redburn Atlantic. “The algo will take the statistically correct approach, and adjust for a range of real-time signals, but that may not necessarily be optimal for that stock, for that PM, on that day.”

Algorithms are not one size fits all, and another limitation is that their appropriateness varies. Just because a tool was the most appropriate one day does not mean it will necessarily be the best tool the next. Traders must consistently assess the usefulness of algo strategies and amend them appropriately to ensure they provide the best outcome.

“There may be a change in market conditions, such that an algorithmic strategy was perfect yesterday, when there’s a significant amount of midpoint liquidity, there’s periodic auction, there’s conditional blocks and so on, and today that might not be the case,” notes Jefferies’ Springett. 

“Having an awareness of the real-time conditions that you’re trading into helps you navigate the limitations of algorithmic trading.”

Another key limitation is liquidity. There is no magical way to create liquidity and algorithms may not be appropriate in every scenario. 

“Pre-trade metrics can give an idea of estimated cost and liquidity – which should help the trader choose the best way to start an order, be it via an algorithm, a high-touch desk, directly using broker capital on risk or any combination of channels,” explains Henderson. 
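One widely used back-of-envelope pre-trade estimate is the square-root impact model, in which expected cost scales with volatility and the square root of the order's share of average daily volume (ADV). The sketch below is that generic textbook approximation, not Invesco's or any desk's actual pre-trade model, and the numbers are illustrative.

```python
# Back-of-envelope pre-trade impact estimate using the common
# square-root model: cost ≈ sigma * sqrt(order_size / ADV).
# A generic textbook approximation with illustrative inputs.
import math

def sqrt_impact_bps(order_shares, adv_shares, daily_vol_bps):
    """Estimated market impact in basis points for an order that is
    some fraction of average daily volume (ADV)."""
    participation = order_shares / adv_shares
    return daily_vol_bps * math.sqrt(participation)

# e.g. trading 4% of ADV in a stock with 150 bps daily volatility
print(round(sqrt_impact_bps(400_000, 10_000_000, 150), 1))  # 30.0 bps
```

An estimate like this, however rough, gives the trader a baseline for deciding whether an order belongs in an algo, with the high-touch desk, or on broker risk.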

Most algos are based on a schedule and that schedule can be interrupted by events. This means that if volume is out of character, the algo must guess, which can ultimately lead to negative results. 

“Any algo is bound by the parameters the trader sets, unless the algo is customised. This is where understanding your algo tools is key. And, if you use more than one algo provider, you must make sure you know the differences between them,” stresses BNY Mellon Pershing’s equity trading desk manager, Matt Short.

Trader intuition

As with any technological advancement, the trader’s role shifts as they look to adapt and improve workflows. Trader intuition is crucial, given that traders can see past historical data and draw on lived experience to make the best decisions in unusual scenarios that algorithms may not be able to detect.

“Buy-side traders have an awareness of the stocks they’re trading, whether or not they’re sectorised, or arranged by portfolio management group, or whatever it might be – they have that underlying experience where they’ve seen a range of different conditions,” notes Jefferies’ Springett. “They’ve identified what can be successful and unsuccessful in those different conditions. And almost on a second nature basis, know what the right tool is for the job at any given point in time.”

Advancements such as artificial intelligence have proven to be beneficial to traders, however, these have been viewed as aids as opposed to replacements of the human trader. The same can be said for algorithmic trading strategies.

The buy-side now has better internal tooling, via their EMS or data provided by counterparty banks, to help ensure they know the best time to use a certain strategy. Thanks to the continued evolution of electronic and algorithmic trading, buy-side traders are now inundated with growing data sets and pre-trade analytics to help determine what to do with a specific order.

KCx’s McConville highlights that “while the efficiency and analytical prowess of algorithms are undeniable, the role of trader intuition in selecting the right algo remains indispensable.”

A benefit of algorithmic trading is its lack of human bias. Humans inherently have biases, be they conscious or unconscious, which algorithms can avoid, making them more useful in certain trading scenarios.

“If you are seeking to identify genuine differences that exist between different things, then having an automated process of managing the distribution of orders across those is critical. It’s impossible for a human being to remove all of their bias from any process,” adds Springett. 

Customisation

As algorithms have developed, how much a firm should customise its strategies has been central to market debate in recent years. Customisation comes with pros and cons, depending on what the algo is being tailored for.

“When considering algo wheels, it is rare that an out-of-the-box strategy is going to be a perfect fit, so true customisations – specifically designed to take account of both the benchmark and the characteristics of the order flow – are more common,” notes Redburn Atlantic’s Risley. 

Although it provides many benefits, customisation can also bring about more complexity, which can increase the risk of unintended consequences. Avoiding these requires robust testing and QA capabilities, alongside change and release procedures, to ensure that a customisation does not behave in unintended ways. The same client customisation might also achieve different outcomes across brokers.

“While we try to build algos that work really well out of the box, we have several clients who each have different needs and requirements,” notes Goldman Sachs’ Harman. “What one client needs from VWAP or liquidity seeking algos doesn’t necessarily match what the next client will want in terms of performance, venues or urgency.”

Customisation can also increase cost and come with added pressures, including ensuring staff, especially new joiners, are trained on each change to the algorithm. The buy-side have been vocal at industry events about the risk that customisation delays updates to algorithms: when a new version of an algorithm is released, firms that have customised it are often left until last to upgrade.

“It is important to stay disciplined when developing [customised algo] solutions, in terms of documentation, increased testing, or even simply ensuring your client understands what the custom actually does in real life,” emphasises KCx’s McConville. “One thing is for certain, being efficient with customised solutions means you really need to understand agility, to avoid a drag on your resources.”

Central to agility is data. Like with anything linked to automation, data plays a crucial role in ensuring the success of any advancement. Algorithms require reliable sources of data around venue performance, smart order routing, liquidity profiles or opportunity costs to ensure they are beneficial to traders.

That data needs to be updated frequently to ensure the effectiveness of the solution, given that real-time data forms a key part of algorithmic behaviour. 

“It is important to understand that the execution landscape is continually changing and that historical data may not reflect today’s reality, but even more crucial is the recognition that the goal of any form of analysis is to develop insights allowing you to improve results,” notes Risley.

Looking forward

Despite its limitations and remaining areas for growth, algorithmic trading continues to show promise as a strategy that helps traders prioritise their time and shift attention to more pressing orders.

Algorithms are getting smarter, and buy-side desks are equipped with more data and analytics to support algo strategies as their tool kits become more sophisticated. Although limitations do exist within algo strategies, they appear to be narrowing. Algorithms are smarter than ever, but there’s still more work to be done.

The TRADE has actively been tracking developments in algorithmic trading over the past 17 years through its annual Algorithmic Trading Survey. First launched in 2008, the survey now receives over 1,500 provider ratings from traders across the globe.
