SEC proposes new rules to tackle the misuse of artificial intelligence in investment processes

New rules are focused on tackling the way predictive data and similar technologies can allow firms to place their interests ahead of those of investors.

Yesterday, the Securities and Exchange Commission (SEC) proposed new rules requiring broker-dealers and investment advisers to address conflicts of interest related to the use of predictive data analytics.

The new rules have a specific focus on the way predictive data and similar technologies are used to interact with investors, in an effort to prevent firms from placing their interests ahead of those of investors.

“We live in an historic, transformational age with regard to predictive data analytics, and the use of artificial intelligence,” said Gary Gensler, chair of the SEC.

“Today’s predictive data analytics models provide an increasing ability to make predictions about each of us as individuals. This raises possibilities that conflicts may arise to the extent that advisers or brokers are optimising to place their interests ahead of their investors’ interests.”

In a statement, the SEC highlighted that the use of certain technologies can lead to investors suffering financial harm on a broader and more significant scale than previously possible.

The proposed rules would require firms to evaluate whether certain technologies could be used to place their own interests above those of investors, and to eliminate or neutralise the effect of any such conflicts.

Under the proposal, firms would have to address these risks using tools specific to the particular technologies they employ.

Elsewhere, the proposed rules would require firms to have written policies and procedures in place to achieve compliance with the proposed rules and to adopt and maintain books and records related to these requirements.

“When offering advice or recommendations, firms are obligated to eliminate or otherwise address any conflicts of interest and not put their own interests ahead of their investors’ interests,” added Gensler.

“I believe that, if adopted, these rules would help protect investors from conflicts of interest — and require that, regardless of the technology used, firms meet their obligations not to place their own interests ahead of investors’ interests.”

The proposals come days after Gensler’s address at the National Press Club, in which he highlighted a number of concerns relating to the use of artificial intelligence.

Among these concerns is the potential systemic risk posed by too many firms becoming reliant on AI models constructed in a similar manner: in the event of volatility, such models are likely to move in a correlated fashion.

Elsewhere, Gensler noted the importance of teams being able to explain and understand the technology they use, as well as the need to acknowledge the risk of misinformation produced by some AI models.
