Despite a spate of exchange trading system malfunctions, regulators must be wary of overreach when finalising Regulation Systems Compliance and Integrity (Reg SCI), two US bulge bracket execution heads have said.
Since the 2010 flash crash, the botched Facebook IPO and an outage of a Nasdaq-run securities information processor (SIP) that stalled trading for three hours last year, the Securities and Exchange Commission (SEC) has made clear it will put in place strong measures to boost market resilience.
But the scope and definitions in the current Reg SCI proposal unfairly affect smaller participants and brokers operating dark pools, according to two directors of execution.
Speaking at the FIX Americas Trading conference in New York last week, Adam Inzirillo, director, global execution services for Bank of America Merrill Lynch, said the SEC had shown intent to craft workable regulation, but that Reg SCI required wholesale changes to achieve its stated goals.
“Regulators need to take a closer look at Reg SCI and how it impacts the brokers and exchanges,” he said, adding that requiring alternative trading systems (ATSs) with low levels of market share to meet the same standard as market leaders would be unfair.
“The big concern is how does this impact on brokers. If you look specifically at ATSs, ensuring they have proper testing makes sense but the rule itself is vague and it impacts the overall broker [rather than the ATS itself] as it is currently written,” he said.
Lack of clarity
A proposed requirement to inform the regulator of system errors was also unclear, he said.
“Every broker is required to file with the SEC every time there is a ‘material event’, but this hasn’t been defined by the SEC,” he said, warning that this had caused alarm on the sell-side as the scope of this definition could seriously affect day-to-day activity.
Speaking on the same panel, Dmitry Bulkin, director, advanced execution services for Credit Suisse, agreed that clearly defining the scope of the rules and which participants were affected would be critical to crafting effective regulation.
“There seems to be a ‘one size fits all’ approach and there doesn’t seem to be a distinction in the risk profiles of entities that will qualify as SCI entities,” Bulkin said.
“If a small ATS or dark pool has a glitch, it’s not going to have the same impact on stability of the entire market as a SIP [outage] for instance. That’s something the regulators should carefully study and modify in the proposal,” he said.
The 377-page Reg SCI proposal, released in March 2013, was intended to increase market stability through more stringent procedures and testing for systems directly supporting trading, order routing, market data, clearing, settlement and surveillance. It included 230 questions for industry participants, responses to which will contribute to the final rules.
In November, the SEC told theTRADEnews.com it was still assessing industry submissions on the Reg SCI proposal, but did not outline what outcomes were expected this year. There are presently no set deadlines by which the regulator must act on these submissions.
Competitive drivers
Another approach to reducing the number and severity of system outages, suggested fellow panellist Rishi Nangalia, CEO of trading technology provider REDI Technologies, would be to publish the number and type of system errors via a public utility.
This, Nangalia urged, would drive a downward trend in system malfunctions as firms compete to minimise glitches, reducing the likelihood of events such as Knight Capital’s 2012 algo error, which lost the company US$440 million.
He said Reg SCI was “reactive” in nature and a more beneficial approach would be to create a ‘scorecard’ whereby firms are judged on a commercial basis for the soundness of their systems and testing procedures.
“What happened to Knight could happen to anybody,” Nangalia said. “People have to remind themselves that when they build something, they need to maintain it and if they don’t then they have to pay a price.”