Regulators told to focus on abusers, not algos

The US Financial Industry Regulatory Authority (FINRA) has confirmed a widespread investigation of algorithms, saying they are “out of control”, but there are doubts about whether curbs on technology are likely to be effective.

Speaking at the North American Financial Information Summit in New York last week, FINRA’s chief information officer, Steve Randich, said the regulator was currently investigating 170 different algorithms for abuse, inadequate supervision and other technology-related issues.

“This is technology just getting out of control – software bugs, poor controls and operational monitoring deficiencies where we’ve actually seen real issues,” he said, citing the 2010 flash crash and the rogue market making algorithm that led to losses of US$440 million at Knight Capital in 2012.

FINRA’s investigations into the operational risks of algorithmic trading come amid greater regulatory scrutiny and wider concerns over the impact of high-frequency trading (HFT). US regulators have also proposed requiring algorithms to be registered and approved as part of Regulation Systems Compliance and Integrity (RegSCI), the Securities and Exchange Commission’s (SEC) attempt to bolster the systems that underpin the US securities markets. But industry experts doubt whether this is an efficient use of scarce regulatory resources.

“Rules requiring algorithms to be registered (RegSCI) are a waste,” said Matt Samelson, principal at consultancy Woodbine Associates.

“It would be a huge exercise and regulators simply don’t have the resources needed, nor do they have the skillset in-house to learn anything useful from these complicated algorithms.”

One broker, who did not wish to be named, also questioned the ability of both FINRA and the SEC to properly police algorithms.

“When we look at RegSCI, there are some unworkable proposals, such as having to re-register for every incremental change made to an algorithm, which we do on a regular basis, and each change would require lengthy documentation. It would be a huge burden on the SEC.”

He said such a regime could result in brokers instead choosing to forgo regular algo updates and make fewer, more substantial changes, which would increase the risk posed by algorithms.

“If you make a lot of changes all at once, there’s a much greater chance that you end up with some unintended consequences,” the broker said.

Samelson said FINRA and the SEC should focus on more fundamental issues to improve stability, prevent market abuse and curb HFT activity.

“Issues such as exchanges giving HFT clients preferential data feeds are much more fundamental to the market than checking algorithms,” he explained.

“Instead, the SEC should focus on knowing who market participants are, ensuring orders are tagged so that any market manipulation or risks can be traced back to individual users. This is a much more practical way to deal with these issues.”

He argued that algorithms themselves are rarely the problem; it is the users of algos who deploy them for market abuse or in a risky manner.

Despite these doubts, US regulators are under increasing pressure to control HFT – and the automation of trading in general – in the wake of revelations in ‘Flash Boys’, the book by Michael Lewis which suggested HFT “rigged” the market against end-investors. The SEC is investigating some HFT firms, particularly those with strategies focused on latency arbitrage or flash orders, while FINRA said it is looking at a number of “threat scenarios” related to HFT to identify potential risks to the financial system.
