Fixing foreign exchange: Addressing legacy technology

The battle with legacy technology systems is an ongoing concern for many banking institutions, but arguably nowhere more so than in foreign exchange. With an explosion of market data and increasing market fragmentation, technology strategies at large FX institutions have never been more important for staying ahead of the game.

The bigger the bank, the bigger the problem. It’s a common perception, and complaint, when it comes to trading systems and technology in an increasingly complex foreign exchange (FX) landscape. FX institutions must now process greater volumes of data and connect to more venues than ever before, all at phenomenal speed, to avoid being picked off by high-frequency traders or left behind by other market players.

In this context, technology is undoubtedly a game-changer. It’s no secret that some large institutions continue to rely on numerous systems implemented 25 years ago. But as the trading game evolves under new regulation, new technologies and changes to market structure, banks are rapidly finding out that those legacy systems simply can’t hack it anymore. For those firms, the task ahead is truly monumental.

Around five years ago, FX activity concentrated on a relatively small number of venues when markets were volatile, but volume is now scattered across more than 70 FX trading venues. Banks are therefore tasked with connecting to those venues and price streams, as seemingly infinite amounts of data are fed into systems which, at many large institutions, are either legacy or simply outdated.

Alongside this, fragmentation in FX is a significant issue to contend with. Banks typically connect to between 40 and 70 data and price streams, combining bank-to-bank streams and venues depending on the institution, although core FX markets are significantly more concentrated, with around 10 major venues including EBS and Reuters. The rise of FinTech vendors and new ideas in the FX space means it is now possible to trade with everyone in the market, but doing so can result in sub-optimal execution, with high rejection rates and often frustrated liquidity providers.

“Typically, the larger the buy-side firm or bank, the more connections they have, which means more data to manage. The larger institutions have numerous legacy systems internally, which often work on legacy technology, not always the newest and most robust,” says Steve Toland, co-founder of trading connectivity firm, TransFICC.

“In the past, many banks tried building everything in-house, effectively fuelling a technology arms race. Many are stuck in this old way of thinking, but now commoditised data services and application programming interfaces (APIs) can be externalised to a utility.”
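To make the aggregation challenge concrete, the sketch below shows the simplest version of the problem: collapsing quotes from many streams into a single best bid and offer. This is a minimal illustration, not any bank’s actual system; the venue names and prices are invented, and a production engine would handle dozens of streams at far higher throughput.

```python
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str        # an ECN or bank-to-bank stream (names here are illustrative)
    bid: float
    ask: float
    timestamp_ns: int # receive time in nanoseconds

def top_of_book(quotes: list[Quote]) -> tuple[Quote, Quote]:
    """Return the best bid and best ask across all connected streams."""
    best_bid = max(quotes, key=lambda q: q.bid)
    best_ask = min(quotes, key=lambda q: q.ask)
    return best_bid, best_ask

# Illustrative only: a large bank might hold 40-70 such streams per currency pair.
streams = [
    Quote("ECN-A", 1.08751, 1.08755, 1_700_000_000_000_000_000),
    Quote("BankStream-B", 1.08752, 1.08757, 1_700_000_000_000_000_100),
    Quote("ECN-C", 1.08749, 1.08754, 1_700_000_000_000_000_200),
]
bid, ask = top_of_book(streams)
print(f"best bid {bid.bid} ({bid.venue}), best ask {ask.ask} ({ask.venue})")
```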

Modular approach

Technology at large sell-side firms needs to work hard and fast to keep pace with the FX market. In recent years, when the now-controversial last look practice and the use of speed bumps were widespread, the risk of publishing stale prices or suffering high reject rates was somewhat easier to control. But the latency-sensitive nature of today’s market means the use of legacy systems at large FX institutions is becoming more problematic. Those systems cannot cope with the volume of data and therefore cannot turn it into useful information, which is crucial for big players in FX trading.
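A small example of the kind of safeguard a latency-sensitive engine needs: before pricing off aggregated quotes, it typically discards anything stale. The sketch below is an assumption-laden illustration; the 50 ms cutoff is invented, and real desks tune such thresholds per venue and currency pair.

```python
import time

# 50 ms staleness cutoff -- an illustrative figure, not a quoted industry number.
MAX_QUOTE_AGE_NS = 50_000_000

def fresh_quotes(quotes: list[dict], now_ns: int | None = None) -> list[dict]:
    """Keep only quotes young enough to price off.

    Each quote is a dict with a 'timestamp_ns' field (hypothetical schema).
    Quoting off stale data risks being picked off by faster participants.
    """
    now_ns = time.time_ns() if now_ns is None else now_ns
    return [q for q in quotes if now_ns - q["timestamp_ns"] <= MAX_QUOTE_AGE_NS]

quotes = [
    {"venue": "ECN-A", "bid": 1.0875, "timestamp_ns": time.time_ns()},
    {"venue": "ECN-B", "bid": 1.0874, "timestamp_ns": time.time_ns() - 200_000_000},
]
print(fresh_quotes(quotes))  # the 200 ms-old ECN-B quote is filtered out
```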

“When it comes to legacy systems in FX, they can be a hindrance especially due to the increasing need for big data and analytics,” explains Mpumi Makhubu, manager of eFX products at Standard Bank. “It’s difficult for legacy systems to hold that type of data and provide accurate information. Transaction cost analysis (TCA) and best execution are particularly difficult to leverage when using legacy systems. In most cases legacy systems are also manually operated, which means there is a far greater risk as they are prone to human error.”
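The TCA Makhubu refers to is built from simple building blocks applied at scale. The sketch below computes one of them, slippage against the arrival mid; the figures are invented, and real TCA adds benchmarks, markouts and venue-level breakdowns.

```python
def slippage_pips(side: str, arrival_mid: float, fill_price: float,
                  pip: float = 0.0001) -> float:
    """Implementation-shortfall style slippage: how far the fill moved
    against the arrival mid, in pips. Positive = cost to the trader."""
    signed = (fill_price - arrival_mid) if side == "buy" else (arrival_mid - fill_price)
    return signed / pip

# Example: buying EUR/USD, arrival mid 1.08750, filled at 1.08757 -> 0.7 pips of cost.
print(slippage_pips("buy", 1.08750, 1.08757))
```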

Banks considered to be ‘technologically advanced’ are responding to market data challenges by updating legacy data servers, analytics and storage hardware. But when it comes to technology, the sell-side must decide whether to seek help from industry vendors or confront the challenge in-house. The “build versus buy” debate continues to rage across the entire financial services industry, with participants seeking the right balance between the two approaches when it comes to legacy systems.

Richard de Roos, head of FX at Standard Bank, believes his ‘modular’ approach to updating legacy systems has paid off. He explains that it can be difficult for large sell-side firms to buy new systems off the shelf, as they could very well be considered legacy within six months.

“What we try to do is have legacy systems in place that are newer than other systems. It’s not proprietary, it’s from a vendor that provides regular updates on an aggregated basis to ensure we are best of breed in most cases,” Roos explains. “On top of that, we are very aware that we do not know where the technology evolution is heading. We need the agility to make sure we aren’t backed into a corner and buy a system that becomes ‘old-school’ legacy in six months’ time.

“With that approach we are switching the costs in terms of moving from one vendor to another on a modular basis, rather than approaching one vendor with all of our system requirements. That has really paid off for the FX business at Standard Bank.”

Roos adds that systems are rigorously checked in a review process roughly every six months to ensure the FX business at Standard Bank is set up with enough agility to make changes if necessary. The trick for large FX institutions, he concludes, is making everything modular and API-compatible without breaking stability – but this has its own risks.

“The other side of the argument is that if you are too modular and you have the need for low latency, it can have the opposite effect and cause instability,” Roos says. “It’s difficult and there are risks, but banks need to evolve and remain wise about the decisions they make, rather than being impressionable, to stay ahead of the game.”
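The modular, API-driven approach Roos describes can be pictured as coding the trading core to a thin interface, with one adapter per vendor or venue, so that swapping suppliers does not touch the core. The sketch below is illustrative only (not Standard Bank’s actual architecture), and all class and method names are hypothetical.

```python
from abc import ABC, abstractmethod

class VenueAdapter(ABC):
    """One adapter per venue/vendor; the trading core codes only to this
    interface, so a vendor can be replaced without touching the core."""

    @abstractmethod
    def subscribe(self, currency_pair: str) -> None: ...

    @abstractmethod
    def send_order(self, currency_pair: str, side: str,
                   size: float, price: float) -> str:
        """Return a venue-assigned order id."""

class StubVenue(VenueAdapter):
    # Placeholder implementation; a real adapter would wrap a vendor API or FIX session.
    def subscribe(self, currency_pair: str) -> None:
        print(f"subscribed to {currency_pair}")

    def send_order(self, currency_pair, side, size, price) -> str:
        return "order-1"

venue: VenueAdapter = StubVenue()   # swapping vendors means swapping this one line
venue.subscribe("EUR/USD")
```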

Other banks have taken a slightly different approach to legacy system upgrades. In February last year, for example, Crédit Agricole Corporate & Investment Bank (CIB) decided to replace its legacy systems for FX trading through a partnership with Orchestrade Financial Systems. All front-to-middle office processing of vanilla and structured products was overhauled and migrated from two legacy platforms to Orchestrade’s platform.

Thomas Spitz, global head of the trading division at Crédit Agricole CIB, said at the time that the overhaul allowed the business to improve risk performance, keep up with regulatory change and reduce costs. Following the implementation, Orchestrade added that the bank had seen improved efficiency alongside a single consistent platform used by the sales, trading, risk and operations teams.

Connectivity and liquidity

Achieving low latency connectivity to various global FX venues, and managing the data those connections accumulate, remains one of the biggest challenges for large FX firms. Alina Karpichenko, global marketing manager at IT infrastructure provider Avelacom, says the vendor works with a number of institutional trading firms and is seeing increased demand for direct access to multiple trading venues.

“It could be exchanges, ECNs, or bank and non-bank liquidity providers with matching servers located across multiple datacentres around the world. What makes it even more complicated is that FX trading is primarily latency sensitive, which is why reliable, low latency connectivity to multiple venues globally is a key challenge,” she says.

Global FX markets face shrinking bank balance sheets and an increasing number of venues with varying trading protocols, meaning institutions must process tens of thousands of data points every second. Liquidity conditions are obviously challenging in this environment, and tight spreads in FX require fast execution. The speed of accessing and processing market data, along with quick order routing, is fast becoming a key differentiator for large banks, but finding the right balance between connectivity and liquidity can lead to complications.

“The optimal use of connectivity and liquidity is a complex problem,” says Alexei Jiltsov, co-founder of data science firm Tradefeedr. “It requires the trader to collect the quote and order data, and quantify market impact and how selection affects liquidity providers in the stack. A trade-off between adverse selection and spread compression gives the optimal composition of liquidity providers. It is also unique to each trader’s flow.”

“Obviously, a major challenge is to capture and analyse every data point in a fast processing environment. Just as connectivity was solved by specialised technology players, market data management and intelligence has a high fixed cost which can be more optimally managed by specialised players. Big institutions are starting to use common infrastructures to solve their specific problems, just as in the case of connectivity.”
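One way to picture the trade-off Jiltsov describes is to score each liquidity provider on an all-in cost that combines quoted spread, post-trade markout (a proxy for adverse selection) and rejection rate. The sketch below is a deliberately crude illustration, not Tradefeedr’s methodology; the weighting and all figures are invented.

```python
def lp_cost_pips(half_spread: float, markout: float, reject_rate: float,
                 reject_penalty: float = 1.0) -> float:
    """Crude per-LP cost: quoted half-spread plus post-trade markout
    (adverse selection) plus a penalty for rejected trades, all in pips.
    The linear weighting is illustrative only."""
    return half_spread + markout + reject_rate * reject_penalty

lps = {
    "LP-A": lp_cost_pips(0.15, 0.05, 0.02),  # tight, some adverse selection
    "LP-B": lp_cost_pips(0.25, 0.00, 0.00),  # wider quotes but benign flow
    "LP-C": lp_cost_pips(0.10, 0.30, 0.15),  # tightest quote, worst all-in cost
}
for name, cost in sorted(lps.items(), key=lambda kv: kv[1]):
    print(f"{name}: {cost:.3f} pips all-in")
```

The ranking shows why the tightest headline quote is not always the cheapest counterparty once markouts and rejections are priced in, which is the heart of the adverse selection versus spread compression trade-off.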

Looking ahead, it’s a safe bet that the volume of data, connections and venues will only increase. For institutions in FX, technology and trading systems will play a substantial role in determining the winners and losers in this space. Large banks have big decisions to make on technology and systems, and those decisions carry considerable risk. But, as in the case of Standard Bank, taking that risk has certainly paid off for some.
