The world’s stock and futures exchanges and other capital-market venues can benefit from using various forms of artificial intelligence, but need regulation that encourages responsible adoption rather than creates barriers.
That’s the message from the World Federation of Exchanges, which says it represents members operating “250 pieces of market infrastructure” that facilitated $124 trillion of trading volumes in 2023.
The WFE weighed in as a response to a US Treasury consultation that it says leans too far toward prescriptive rules. Traders and exchanges have been heavy users of artificial intelligence on top of electronic trading for decades, but the hype around large language models for generative AI has prompted regulators to look again at the role of AI in capital markets.
AI priorities
Exchanges around the world are looking at AI, including machine learning, to reduce compliance burdens, enhance risk management, and, with genAI, pre-fill trade orders.
But the number-one priority is to use AI for market surveillance, says James Auliffe, manager of regulatory affairs at WFE in London.
“If exchanges could conduct market surveillance more efficiently with AI, it would be huge,” he told DigFin. “Our members are afraid of manipulators who might use AI tools, and they’re asking whether the industry can counter these threats before they even materialize.”
First mover
Nasdaq, in keeping with its fintech roots, is launching such tools. One, Verafin, lets banks deploy genAI agents within its anti-money-laundering systems, automating more of the process. Nasdaq is also testing an AI copilot feature meant to cut investigation times.
Nasdaq also launched an order-filling application powered by AI in June, which has been approved by the US Securities and Exchange Commission. Meant to improve fill rates, the feature is aimed at a particular order type for trading at midpoint over a fixed holding period during which market makers and trading desks source liquidity.
The AI loosens these fixed periods, changing the length of the holding window during the trading day based on the individual stock and its analysis of demand. The tool should help traders either fill orders more quickly or signal when to wait to attract bigger volumes.
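The idea of varying a holding period with estimated demand can be sketched in a few lines. This is purely illustrative: Nasdaq has not published Dynamic M-ELO’s internals, so the thresholds, function names, and the linear mapping below are all invented assumptions.

```python
# Hypothetical sketch: map a per-stock demand estimate to a holding
# period for a midpoint order. All numbers and logic are illustrative
# assumptions, not Nasdaq's actual model.

MIN_HOLD_MS = 2    # assumed shortest holding period (milliseconds)
MAX_HOLD_MS = 30   # assumed longest holding period (milliseconds)

def holding_period_ms(demand_score: float) -> int:
    """Return a holding period given a demand estimate in [0, 1].

    High demand -> shorter hold, so orders fill quickly.
    Low demand  -> longer hold, waiting to attract larger contra volume.
    """
    if not 0.0 <= demand_score <= 1.0:
        raise ValueError("demand_score must be in [0, 1]")
    # Simple linear interpolation between the two bounds.
    hold = MAX_HOLD_MS - demand_score * (MAX_HOLD_MS - MIN_HOLD_MS)
    return round(hold)

print(holding_period_ms(0.9))  # heavy demand: short hold
print(holding_period_ms(0.1))  # thin demand: long hold
```

The real system would replace the linear mapping with a model trained on per-stock data, but the interface, a demand signal in and a holding window out, captures the behavior the article describes.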
Announcing the feature, called Dynamic M-ELO, Nasdaq said its data science and AI teams spent several years tuning the product and the relevant data points.
Heavy hand?
What Nasdaq does matters worldwide because many other exchanges and clearers use Nasdaq’s technology.
WFE’s Auliffe says members that use Nasdaq’s matching engine are keen to see some of these AI use cases turned into products they too can adopt. Other groups are developing their own AI, including use of LLMs.
The industry is worried that authorities may take too heavy a hand with AI. Richard Metcalfe, head of regulatory affairs at WFE, cited Basel rules that required banks to massively overcapitalize any crypto held on balance sheet, and other rules that have deterred adoption of blockchain-based systems.
“On the one hand, these have been examples of principles-based regulation, which we support,” he said. “But these are already highly regulated entities. The blockchain example shows that sometimes regulation can become too prescriptive. That leads to further problems because then the regulations must be constantly updated, because they’re trying to pin something down.”
WFE’s stance
Exchanges are calling for the US and other jurisdictions to regulate AI in a way that encourages adoption. That includes defining AI narrowly, as “computer systems with the ability to make decisions or predictions based on automated, statistical learning”, according to the WFE’s consultation response.
In practice that means regulations should address AI-specific risk-management tools, so that firms can use AI to detect fraud and surveil markets. The rise of deepfakes and more sophisticated phishing attacks means the industry will have no choice but to fight fire with fire.
Not so scary
The industry group also argues that some perceived risks in AI are manageable. One commonly cited fear is ‘explainability’: that AIs are black boxes. While this may be a big issue in retail markets, for capital-markets entities the models are based on predefined rules and can be interrogated, so CTOs and other technology executives can understand what’s going on.
Another concern typical in the world of exchanges is the over-concentration of vendors. Nasdaq isn’t the only provider of matching engines but the shortlist is, well, short. Although LLM companies are also few in number because of the vast compute resources they require, the AI vendor world is otherwise diverse.
The barriers to adoption are internal: AI solutions cost a lot of money, and they require people who know how to use them. To this end, WFE says regulatory clarity is important because it will help exchanges decide which AI companies to onboard and who to hire.
WFE’s paper is in response to a June consultation by the US Treasury, which “is interested in perspectives on topics including potential obstacles for facilitating ‘responsible’ use of AI within financial institutions, the extent of impact on consumers, investors, financial institutions, businesses, regulators, end-users and any other entity impacted by financial institutions’ use of AI, and recommendations for enhancements to legislative, regulatory and supervisory frameworks applicable to AI in financial services.”