Why regulators must open up their data to the City

Opening up regulatory data to the public sounds radical, doesn’t it? But there are strong precedents showing that it is technically feasible, and in the public interest, to open up datasets that provide transparency on how key markets work.

Since 2012, HM Land Registry has provided open access to the UK’s database of land titles and property sales, for example. Such transparency sits at the heart of digital assets built on public blockchains; with a few exceptions, anyone can open up a blockchain network explorer and start analysing the flow of tokens from wallet to wallet. A rich ecosystem of analytical tools has been developed to provide real-time insights into the dynamics of specific markets — loan pricing, staking arbitrage opportunities, liquidity and more.

In traditional financial services, the data collected by regulators can be grouped into three buckets: firm data; product data; and traded market data. Firm data relates to the organisation, governance and prudential standards — the capital, liquidity and risk profile — of a financial institution. Product data, as you might expect, covers the products — mortgages, investments, bank accounts — sold by a firm. Traded market data reflects orders and transactions in the likes of shares, bonds and derivatives.


Opening up this data — through public application programming interfaces — would enable the so-called ‘regtech’ industry to develop a suite of new analytical tools, and would deliver two key benefits. First, the regulator would gain access to the crowdsourced wisdom of the industry, as interested parties analyse the data and surface new insights — spotting fraud and market abuse, for example, or assessing emerging risks in markets or in specific firms.
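To illustrate the kind of crowdsourced analysis an open dataset would enable, here is a minimal sketch in Python. The trade sizes and the z-score threshold are purely hypothetical assumptions for illustration, not any real regulatory feed or detection method; it simply flags trades that sit unusually far from the average, the sort of first-pass screen an outside analyst might run against published market data.

```python
import statistics

# Hypothetical sample of reported trade sizes for one instrument,
# as might be published via an open regulatory API (illustrative data only).
trade_sizes = [100, 120, 95, 110, 105, 98, 5000, 102, 115, 99]

def flag_outliers(values, threshold=2.5):
    """Return values more than `threshold` standard deviations from the mean.

    A crude screen: real market-abuse detection would use far richer
    features, but this shows the shape of analysis open data permits.
    """
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:  # all values identical; nothing to flag
        return []
    return [v for v in values if abs(v - mean) / stdev > threshold]

print(flag_outliers(trade_sizes))  # the 5,000-unit trade stands out
```

Because the data would be public, any number of analysts could run variations of such checks and report suspicious patterns back to the regulator.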

Second, regtech vendors would be able to tune their products with this data so as to better help financial institutions comply with their regulatory obligations, significantly reducing this cost burden. Furthermore, by embracing external data analysts, regulators could gain exposure to new techniques for data analysis and visualisation, enabling supervisory bodies to accelerate their digitalisation agendas without a commensurate increase in cost — a clear win for post-Covid cash-strapped authorities, and in line with the Financial Conduct Authority’s objective to become a data-driven regulator.

There’s plenty of evidence that a culture of openness can help increase trust in institutions and markets, supporting the FCA’s objective to enhance market integrity. Of course, there are a number of practical implementation issues to consider. For example, under GDPR, it would be illegal to reveal personally identifiable information in a public dataset. Financial institutions might be concerned that some of the information held by regulators is commercially sensitive. Perhaps most critically, some regulatory information relates to active civil and criminal investigations, and revealing it might tip off criminals.


However, these concerns can each be addressed pragmatically without undermining the main objective of giving the private sector, and the public at large, transparency into the workings of the financial sector. As digital assets enter the mainstream, the underlying infrastructure of financial services is evolving in a way that makes open data the default mode.

With the FCA and Bank of England in the early stages of their new data strategy, now is the ideal time for the UK to establish its leadership as the most open, transparent regulatory system in the world.

James Nicholls is managing director at Braithwate
