Institutions across the buy- and sell-side are expressing frustration with the soaring cost of data – and when nearly every participant is complaining about the same thing, it is probably worth examining.
Fixed income data spend for sell-side institutions has increased by half in the last five years, according to a recent report by AFME, while market data spend more generally has risen by a quarter. Spend on exchange data has also risen by 42% since 2017. These costs are felt keenly across the street.
Data today is more essential than it has ever been for trading and as the market evolves it is set to play an ever bigger role in how institutions operate. New reporting and research requirements brought in under Mifid II have increased the quantity and the complexity of data required to execute, while fragmented markets mean participants are often forced to use several sources, all at a cost. This has been exacerbated by the growing need for data to comply with environmental, social and governance (ESG) regulations and to offer dual services in the UK and Europe post-Brexit.
“Data fees are increasing along with the granularity, depth and the speed of the information that the trading venues provide because that’s more valuable to participants,” said Hayley McDowell, European equity electronic sales trader and market structure consultant at RBC Capital Markets. “It’s no secret that commissions are declining and profit margins are being squeezed across the board, so the cost of doing business is increasingly under the spotlight.”
AFME’s report suggests that soaring data expenses have been driven by a 35% increase in the existing cost base and a 15% increase from new incremental data usage. For the fixed income markets, automation and the expanding range of instruments available have added to this increase.
“We are a buy-side company focused on quantitative, systematic strategies, so data has a particular importance for us,” said Tobias Stein, managing director and head of portfolio implementation at Quoniam Asset Management. “In fixed income, we have a growing universe in terms of assets which results in higher data costs. As a quantitative asset manager, we are in a position to systematically explore these larger universes but this is obviously a driver of additional costs.”
In light of growing data costs, many buy- and sell-side institutions are increasingly disgruntled that venues, and the vendors that aggregate their data, take on no risk in the transactions they profit from, instead simply repackaging that data and selling it at a high price.
The greatest fixed income cost increase recorded by the sell-side in AFME’s report was for reference and price data, which has increased by a third in the last five years, driven by data demand, changes to vendor licensing agreements and Mifid II obligations.
While Mifid II was designed to reduce data costs, a case can be made that it has done the opposite, particularly in the fixed income markets. The regulation brought with it swathes of new requirements and data complexities for firms to adhere to, including pre- and post-trade reporting requirements to ensure best execution, which have impacted the buy-side heavily.
Under Mifid II, there are 15 Approved Publication Arrangement (APA) facilities appointed in Europe to manage and publish firms’ reporting data, while MTFs retain responsibility for trades executed on their platforms. However, as AFME’s report highlights, not all APA data is available via all data vendors, meaning firms often have to enrich this data by purchasing it from multiple other sources, thereby duplicating costs. In addition, these sources are seldom standardised, meaning significant additional work can be required before the data is of any value.
“Another driver [of costs] was the change in policy including external research etc.,” added Stein. “I think cost focus in terms of the consequences of regulation, especially for the buy-side, is something which would be key in our perspective for future regulatory changes.”
Supply and demand
Data is now the lifeblood of all markets and those who supply it are only too aware of its importance. Across the markets, the venues that host order flow hold a monopoly over pricing and depth of book data, which institutional investors cannot live without. Institutions can find themselves on the back foot if they are only accessing data on a secondary basis as opposed to straight from the source, and because there is a lack of alternatives, pricing will never be as competitive as it should be and venues will always hold negotiating power.
“The most relevant data originates from the venues because that’s where the inventories are, that’s where the RFQs are flying back and forth and that’s where the trades are happening. That obviously puts them in a good position when it comes to data quality and potentially also charging for that data quality,” said Mikael Björkman, head of fixed income execution and OTC structured products at Credit Suisse.
The lack of competition among execution platforms and exchanges providing market data across asset classes is something that regulators globally have begun to assess. The UK’s Financial Conduct Authority (FCA) is currently underway with a year-long investigation examining whether a lack of competition is driving trading costs up. Regulatory requirements for benchmarking a trade against an independent source, indices and credit ratings have also driven up costs for market data, and this is another area the FCA has chosen to examine, exploring whether high costs could be limiting the number of new entrants in the market.
In a bid to alleviate cost concerns from participants around market data provided by exchanges in the US, regulators last year proposed to increase the level of data publicly available via real-time consolidated data feeds for US equities – otherwise known as SIPs – to include depth of book data that has historically been exclusively provided by exchanges. The move landed the Securities and Exchange Commission (SEC) in a lawsuit after incumbent exchanges Nasdaq, the New York Stock Exchange (NYSE) and Cboe went so far as to take the watchdog to court in February over the proposed extension. Legal proceedings are still underway.
“More competition would certainly help address the monopoly venues have,” said one individual, who wished to remain anonymous. “However, this is easier said than done. In the fixed income markets three incumbent trading platforms are responsible for a large amount of flow and a new contender would probably struggle to make inroads.”
AFME’s report references three unnamed MTFs in fixed income, all of which have seen spend on their pricing data rise by more than 50% – and the identity of those platforms is not hard to guess.
As exchanges and venues hold a monopoly over the data that is accumulated on their platforms, they also control the way that their products are priced and how much transparency they want to disclose around how they reach that figure. In listed securities markets, where participants rely more heavily on exchanges, this is more prevalent.
“The data we receive is often raw so we have to work on it once we receive it. Technology and manpower are needed to analyse it and make it usable,” said one anonymous individual. “Data products are also bundled with other products so it’s difficult for market participants to know exactly how they are being priced.”
What’s more, technology has advanced so significantly over the last five years that participants find it difficult to understand how rising market data costs can be justified by exchanges and venues that claim their own costs are also going up.
“It seems that the market data fees have increased while costs for exchanges may have declined – for example, the likely cost of dissemination due to improvements in technology in recent years,” explained McDowell. “Market data is not regulated and the fees for it are often bespoke according to different clients.”
The recently-launched Members Exchange (MEMX) moved to introduce blanket fees for its real-time equities market data in February this year, in a bid to offer a “significantly less expensive” service in comparison with other exchanges. Founded in 2019 by BofA Securities, Charles Schwab Corporation, Citadel Securities, E-Trade, Fidelity Investments, Morgan Stanley, TD Ameritrade, UBS and Virtu Financial, MEMX is designed as a contender to incumbent exchanges NYSE, Nasdaq and Cboe. The exchange plans to publish its fee schedule to offer transparency and allow users to compare costs with other providers.
“It is difficult to separate costs associated with trading and data and this means data is subsequently priced according to its value. Data’s value is increasing and therefore costs are increasing also,” said an anonymous individual. “Unbundling the cost of trading from the cost of data will probably not make any difference to the price of data but it may offer comfort to participants.”
In its report, AFME suggests that standardised pricing models for purchasing data should be implemented across all vendors, alongside uniform formats in which data is stored and provided to firms and consistent data access procedures, to remedy the issue.
Director-general of the Federation of European Securities Exchanges (FESE) Rainer Riess, however, believes exchanges are often scapegoated in the market data debate surrounding listed securities, instead arguing that they are the most regulated and open about their pricing while the buy- and sell-side are not required to meet the same standard.
A report by FESE and Oxera found that exchanges’ market data revenues had risen only 1% between 2012 and 2018. However, when regulators put forward plans for a real-time post-trade consolidated tape in Europe in November, the Federation was quick to condemn the proposals and the impact that they would have on their revenue. The same report found that in 2019, aggregated market data revenues on exchanges were just €245 million, equivalent to 0.003% of total assets under management.
“It’s only exchanges that are so far regulated but interestingly the debate is always coming back to them. It’s a €245 million market, which in European terms is not big. The bigger fish is everything around it,” he said. “Business is changing and there is a higher cost with it. Exchanges are one area that is under very stringent regulation on a reasonable commercial basis and there’s been a lot of scrutiny of the exchange cost. When ESMA for example did its survey and the Commission asked us to, we provided revenue figures from our members. Interestingly enough the sell- and buy-side have never done an equal exercise.”
Instead, he attributes rising costs in market data – particularly in Europe – to the fragmentation of the markets and complexities of aggregating data. “The complexity of markets has significantly increased. Ten or 20 years ago you had your five or six dealers and you picked up the phone and talked to them. There wasn’t much about data. Today you have 600 venues across Europe and you’re getting all kinds of bits and pieces of information that you’re trying to piece together.”
Aggregation and direct connectivity
Buy-side data spend is predominantly dedicated to pricing and reference data – the area where, as seen previously, costs have risen the most of any data type – alongside terminal data and ratings, AFME’s report found. It is because of this that buy-side institutions, particularly in fixed income, have begun exploring in-house aggregation solutions and direct connectivity with the sell-side, either developed using proprietary technology or bought as off-the-shelf products.
Credit Suisse, for example, aggregates data in-house across multiple venues, and this, according to Björkman, puts the bank in a favourable position by reducing its reliance on expensive venues and vendors for data. Venues have a different incentive for providing data than vendors, he explains, as they are trying to make their markets look attractive to participants.
“It is the power of the aggregation that’s happening from the buy-side perspective. That means that you cannot overcharge for data, at least not for the buy-side. We are a pretty large buy-side house with a significant flow which the venues can then benefit from,” he said. “We aggregate across multiple venues and the interest for them to show us data is therefore greater. It’s up to them to announce that liquidity and thereby provide us with data that supports the decision to go there.”
However, this is not a be-all and end-all solution. Credit Suisse relies heavily on proprietary technology and spend to aggregate data and this is not a luxury that all buy-side institutions are able to enjoy.
“It is obviously a favourable position to be connected to multiple venues and to be able to aggregate and base a decision on that. It’s still a fairly small number of the buy-side houses that have the means and infrastructure to aggregate data,” Björkman added.
Off-the-shelf execution management systems from the likes of FactSet, FlexTrade and TORA also offer buy-side firms aggregation capabilities. The use of these EMS or proprietary aggregation technologies to piece together fragmented sources of liquidity can reduce participants’ reliance on data offered by venues, as they can be used to identify a price while the trade itself still takes place on the venue.
However, adoption of these systems is not yet at critical mass, and potential future changes to regulation could make them just as costly for participants. Regulators set out plans in January to potentially widen the scope of what is classed as a venue under Mifid II to include order and execution management systems – a change that could leave buy-side firms that have integrated such systems facing additional costs that counterbalance any savings made on data.
“There are EMSs that facilitate this aggregation. It means that from a functional perspective you will have similar ability to aggregate data regardless of your infrastructure and size. The market tracking reference prices on the venues are still going to be the premium. That doesn’t mean that you will be totally dependent on them,” explained Björkman. “I don’t know how it looks in smaller private banks. In Europe, we tend to be further progressed than, for instance APAC and even the US, and that could also mean that you won’t see this development for a few more years until people catch up. EMSs capable of aggregating data are not on every desktop yet.”
As suggested in AFME’s report, a consolidated tape could go some way to reducing market complexities; however, a tape alone will not resolve the issue of rising data costs. For fixed income in particular, where most products are traded over the counter due to the sometimes infrequent, non-standardised and illiquid nature of contracts, the specifics of a tape must be mapped out in detail before any significant impact is felt.
“We all know roughly where the liquid stuff trades but it very quickly becomes far less liquid and that means the density of the data and quality of the data is going to be far less,” said Björkman. “That’s not going to change in fixed income. Even if you get a reference price from a trade that happened yesterday on the consolidated tape, can I use that data? What’s the value? The precision?”
If left unaddressed, the high price of data is only set to get higher and could leave some firms at a competitive disadvantage, forcing them to cut services or even withdraw from some markets altogether. Regulators are finally beginning to recognise that there is a real issue here, and the various investigations currently underway could result in some welcome changes for concerned market participants.
In the meantime, the buy- and sell-side banding together through various initiatives could see a reduction in the market’s reliance on expensive venues and vendors. As they say, there is strength in numbers. However, whether any of these schemes can gain the critical mass needed to make an impact may be another story.