Chris Middleton looks at some of the challenges posed by AI in a world of financial innovation and constant regulatory disruption.

With RegTech and FinTech booming, artificial intelligence (AI) has a major role to play in both – and in helping the financial services sector navigate oceans of data in a complex, ever-changing regulatory world.

But is AI – a technology still in its infancy in enterprise application terms – really the global problem solver that some claim it to be? With some solutions little more than black boxes, creating auditing problems and a lack of transparency, some argue that AI itself challenges longstanding legal principles of liability and accountability.

At a Westminster eForum AI policy conference in London just before Christmas, co-chair the Rt Hon Lord Reid of Cardowan – a former Home Secretary, an experienced Cabinet minister, and, among other things, Honorary Professor and Executive Chair of the Institute for Strategy, Resilience and Security (ISRS) at University College London – suggested that the financial and insurance sectors could be especially problematic as AI deployments rise.

“If you think the problems are bad now, it’s going to be a nightmare for insurance,” he said. “This is my view, but insurance ultimately depends on legal liabilities and it’s always struck me as peculiar that in all the major infrastructure in Britain – whether it’s engineering, building, construction, financial companies, pharmaceuticals or whatever – there is an audit process.

“That doesn’t make any of them perfect… but it does impose a feeling of responsibility, because ultimately the audit process [behind decisions and the decision-making process] is there. But there is no auditable process for software, there are no overall regulations, there’s no legislation, you can’t ask who put that line of code in, when did they put it in, why did they put it in – in other words, ask who is responsible.

“To me, this is a real problem and it will get worse with AI. Because not only do you not have the coding audit for a normal piece of software, but once you get to black box AI, when inside that black box it is actually learning, to me it creates a real challenge for insurance and I think that’s an area that we ought to look at.”
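Lord Reid's distinction between conventional code and learning systems can be made concrete with a deliberately tiny sketch (illustrative only, using invented data and a hypothetical credit-approval scenario): a classifier whose decision rule appears nowhere in its source code, only in weights fitted to historical data – which is precisely what frustrates a line-by-line audit.

```python
import math
import random

random.seed(0)

# Hypothetical training data: (income_score, debt_score) pairs,
# with label 1.0 = approve. Entirely invented for illustration.
data = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(200)]
labels = [1.0 if a - b > 0 else 0.0 for a, b in data]

# Logistic regression fitted by plain gradient descent. No programmer
# ever writes the approval rule down - it emerges as the values in w.
w = [0.0, 0.0]
for _ in range(500):
    for (a, b), y in zip(data, labels):
        p = 1.0 / (1.0 + math.exp(-(w[0] * a + w[1] * b)))
        w[0] -= 0.1 * (p - y) * a / len(data)
        w[1] -= 0.1 * (p - y) * b / len(data)

# An auditor reading the source finds only the generic loop above;
# the effective "line of code" behind each decision is the numbers in w,
# and there is no record of who put them there, when, or why.
applicant = (1.0, -0.5)
score = 1.0 / (1.0 + math.exp(-(w[0] * applicant[0] + w[1] * applicant[1])))
print(w, score > 0.5)
```

Even in this toy case the audit question "who wrote the rule that approved this applicant?" has no answer in the codebase – the rule was learned, not written – and real black-box systems only deepen that gap.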

Financial services is in the vanguard of enterprise AI deployments worldwide, and this is changing the core nature of the industry, according to at least one major report. Last year, the World Economic Forum (WEF) published a wide-ranging analysis on how financial services will be both disrupted and challenged by AI.

The report, The New Physics of Financial Services: Understanding how artificial intelligence is transforming the financial ecosystem, suggested that the bonds that have historically held financial institutions together are actually weakening, not strengthening, as a result of new technologies.

The operating models of financial institutions are being reshaped, making them more specialised, leaner, more highly networked – and more dependent on the capabilities of technology players. In other ways, the same technologies are actively removing competitive differentiators, said the WEF.

In all of this creative destruction, there is real opportunity. Yet market regulators will be constantly challenged, said the WEF. Data regulations will have transformative impacts on the shape and structure of financial markets, particularly where they require increased data portability.

“Unlocking the full potential of AI will require financial institutions and their regulators to co-create new approaches and solutions,” said the report.

As some processes are shifted to shared utilities, institutions will seek to offload accountability to those central utilities as well, while regulators will push to hold the original institutions accountable. This will create new tensions within the market.

Efficient compliance will become a commodity. As institutions pool compliance services, they will find themselves on the same competitive plane, removing yet another differentiator between players.

Yet in a market where every institution is vying for data diversity, managing partnerships with competitors will be critical, if fraught with strategic and operational risks.

“Regulations governing the privacy and portability of data will shape the relative ability of financial and non-financial institutions to deploy AI, thus becoming as important as traditional regulations to the competitive positioning of firms,” said the report.

Global data regulations are themselves undergoing a period of unprecedented change. For example, the EU’s revised Payment Services Directive (PSD2) came into force a year ago, with the aim of enabling more innovative payments across Europe.

In conjunction with the General Data Protection Regulation (GDPR), this means institutions have to balance requirements to share data with third parties against the risk of substantial penalties in any cases where data is lost, hacked, or mishandled.

The UK – which has implemented GDPR through the Data Protection Act – was one of the first jurisdictions to introduce open banking a year ago. While China does not have a comparable system, existing regulations have already been conducive to FinTech and RegTech companies, according to the WEF.

In other parts of the world, governments are considering radical changes to their data regimes. Australia, Singapore, Canada, and Iran, among others, are actively considering different forms of open banking, mirroring the steps taken by the EU and UK. In this regard, the US remains an anomaly. There, data-sharing alliances are more ad hoc, with banks building bilateral relationships with data aggregators.

However, Congress has been listening to testimony from technology companies on the topics of data privacy and security. That said, many such companies oppose the data privacy regulations introduced recently in California (CCPA), which some suggest could become a de facto US standard.

The rules come into force in the state – home of Silicon Valley – in 2020, and several advertising- and data-driven tech companies, such as Google and Facebook, are attempting to water them down before then. The likely outcome is a significantly weaker federal solution, rather than national adoption of CCPA.

Either way, the evolution of data regulations worldwide will be the critical driver in determining the roles of different players in financial services.

Growing cyber-risks present further operational challenges. Institutions will need to develop strategies to mitigate the increasing risk of abuse or leakage of confidential data at both customer and transaction level, as well as the increased risks that come with sharing sensitive, competitive information.

Yet something not mentioned in the report is that if AI can automate fraud detection – a big plus for the technology in the WEF’s estimation – then, logically, it may also automate fraud itself, or make it harder to detect.
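The double-edged nature of that point can be illustrated with a deliberately simple sketch (illustrative only, with an invented account history; real systems use far richer models): a rule that flags transactions deviating sharply from an account's past behaviour. The same statistical profile that powers the detector would, in an attacker's hands, reveal exactly how large a fraudulent transaction can be before it is flagged.

```python
import statistics

# Hypothetical account history: typical transaction amounts in GBP.
history = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0, 44.0, 58.0, 49.0, 51.0]

mean = statistics.mean(history)
stdev = statistics.stdev(history)

def flag_suspicious(amount, threshold=3.0):
    """Flag a transaction whose amount sits more than `threshold`
    standard deviations from this account's historical mean."""
    return abs(amount - mean) / stdev > threshold

print(flag_suspicious(50.0))   # prints False: in line with history
print(flag_suspicious(900.0))  # prints True: far outside it
```

Automating the detector also automates knowledge of its blind spots: anything inside the threshold passes silently, which is why adversarial use of the same techniques is a genuine concern.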

In conclusion, then, as AI takes an increasingly critical role in the day-to-day operations of the financial system, it poses a new source of systemic, as well as ethical, risk, while challenging longstanding legal concepts, such as liability (in situations where no human, other than a coder, was directly involved in making a decision).

“Without proper oversight, AI innovation could introduce new systemic risks into the financial system and increase the threat of contagion,” warned the WEF. “AI is likely to have a transformative effect on the global financial system, so the task of the ecosystem will be to maximise the benefits, while mitigating the harms.”

Something to consider while rushing towards AI-enabled tools, including within RegTech itself.
