When it comes to automation, one principal challenge lies in the inefficiencies that exist between FX and other asset classes, asserted Thomas Roberts, foreign exchange dealer at abrdn, speaking on a panel at TradeTech FX in Amsterdam.
“If we solve a problem for one in our data flow, how does that reflect on scalability of another? These things are very siloed and that creates inefficiencies between all of your asset classes. Now the next evolution is merging all together to make them more scalable as a whole rather than individually, especially if we go more multi-asset on the buy-side.”
Panellists were quick to point out that the answer is not to have a surplus of systems to bolt on, but rather to focus on being structurally sound by going back to the base factors, which is particularly important in the face of market disruption.
Oskar Wantola, head of execution technology at Man Group, explained: “We try to build the best solution for structural challenges and we want to have a stop loss solution for events so we always look at a combination of both.
“So we have the structural challenges – solving for inefficiency, handling the data, building the trading solution with smart routing solution – and then on top of that, we need to make sure that as you automate trading, if there is an event, you can stop it from losing your money.”
Read more: Shift to T+1 set to intensify the need for automation in FX
“During a market event, you expect volatility, you expect more data, you need your system to hold up in that period – you can’t have it go down, you have to start thinking about your liquidity […],” said Eugene Markman, chief operating officer (FX) at ION Markets.
“So, when you’re building a system, you have to think forward, to the future. Future-proof it and start thinking about not only how it would function on a day-to-day basis, but how it will also function during an event like a flash crash.”
When it comes to the specifics of manual supervision and oversight versus automation, the panel agreed that internally there are a lot of conversations taking place as to where responsibility falls. Essentially this encompasses various teams, leading into the ever-feared ambiguity factor.
Roberts explained that figuring this out is a “huge” aspect for his firm from the machine learning side.
“From a dealing side of things, if I haven’t built the tool and then the tool is doing things for me, is it my fault if it goes wrong? Is it compliance? Is it the fund manager? You can teach [machine learning] to repeat patterns and learn from yourself, so that’s straightforward, but then […] who is essentially going to get called by the client once it goes wrong?”
The panel agreed that, priority-wise, figuring out risk management in the automation game comes first, before beginning to run down that road.
Bart Joris, head of FX sell-side trading at LSEG, added that from the exchange’s perspective, building controls makes up a large part of what they do: “We try to automate the entire workflow and to do that we obviously need to have automated controls that will trigger stop trading – trigger kills. Building controls is part of our research process, which has quantitative traders involved.”
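The automated controls Joris describes can be pictured as a pre-trade gate plus a kill switch. The sketch below is purely illustrative, with hypothetical names and thresholds (it is not LSEG's or any panellist's actual implementation): a loss limit and an order-rate limit that, once breached, halt further trading.

```python
from dataclasses import dataclass

# Minimal illustrative sketch, not a production control. All names and
# limits here are hypothetical; real venue controls are far richer.

@dataclass
class KillSwitch:
    max_loss: float          # halt if cumulative P&L drops below -max_loss
    max_orders_per_window: int  # halt on runaway order flow in one window
    pnl: float = 0.0
    orders_in_window: int = 0
    halted: bool = False

    def record_fill(self, pnl_change: float) -> None:
        """Update running P&L; trip the switch if the loss limit is hit."""
        self.pnl += pnl_change
        if self.pnl < -self.max_loss:
            self.halted = True  # "trigger stop trading – trigger kills"

    def allow_order(self) -> bool:
        """Pre-trade check: every outbound order must pass this gate."""
        if self.halted:
            return False
        self.orders_in_window += 1
        if self.orders_in_window > self.max_orders_per_window:
            self.halted = True  # runaway automation, stop the flow
            return False
        return True

ks = KillSwitch(max_loss=100_000, max_orders_per_window=50)
ks.record_fill(-120_000)  # simulated loss breaching the limit
print(ks.allow_order())   # prints False: the kill switch has tripped
```

The design point mirrors the panel's discussion: the control sits outside the trading logic itself, so even a misbehaving automated strategy cannot keep sending orders once a generic limit trips.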
“[…] We have generic controls across asset classes and specific controls. We work with our partners on that. It’s very much collaboration between trading researchers, technology and compliance within the company.”
When asked which of these parties receives the call in the event of something going wrong, Joris confirmed that ultimately it is the head of trading risk.
He added: “One of the big things which I always say to everybody is that an AI model will not go to jail, the CEO will go to jail.”
Read more: Workflow automation is non-binary, say experts
Addressing the current state of play when it comes to the challenges and inefficiencies, Wantola asserted that “the definition of the inefficient market is a lack of information and delayed reaction to markets,” and that this should be front of mind.
“From a technology point of view, in order to increase your chances of having the correct information and correct prices, you would ideally increase the number of LPs that you trade with [and] you would like to build solutions that are fast, low latency.”
Steve Totten, managing director, head of institutional and quantitative products at oneZero, concurred, asserting that while vendor solutions can help, it is ultimately down to making the increase in data actionable for the buy-side – real and empirical usability.
“What do I do next? How does that help me choose when to execute? The next generation of analytics is very much around providing practical information.”
In conjunction with this comes the ever-relevant interoperability point – essential to bear in mind when it comes to addressing specific inefficiencies.
“We have a lot of systems that are available in the market and there’s a combination of clients who either build or buy – sometimes build and buy – and the interoperability between them is sometimes lacking. The data flow isn’t necessarily how you want it to be, as well as collecting a tremendous amount of data so your storage costs go up – how do you manage that to get useful benefits out of it?” said Markman.