Reporting and Governance

The move towards the digitisation of financial products raises important questions around customer data, fairness and the treatment of vulnerable customers, requiring financial firms to tread a careful path. By Suchitra Nair, director at Deloitte’s EMEA Centre for Regulatory Strategy, and Morgane Fouché, assistant manager in Deloitte’s EMEA Centre for Regulatory Strategy.

The shift in focus from human-enabled to digitally-enabled customer journeys takes financial services (FS) firms into some nuanced regulatory and technological hotspots, which require careful consideration and navigation. 

Such hotspots arise for numerous reasons. In a digital world, the human link that would test the customer’s understanding of the risks is less present than in traditional interactions. This heightens the risk of customers buying FS products rapidly without understanding the risks involved.

Additionally, in a more complex data-sharing ecosystem, the relationship between the various participants may be less immediately obvious, and their respective regulatory responsibilities more difficult to identify and communicate clearly to customers.

In our recent publication, ‘The Retail Customer Digital Journey’, we explored some of the regulatory hotspots arising from a customer journey facilitated primarily through digital channels and incorporating new technologies. As part of our research, we conducted an online YouGov survey of more than 1,900 UK smartphone owners, asking a series of specific questions about their perceptions of risk and digitally-enabled financial advice. We summarise some of our key findings below.

Transparency and communication

Once consumers share their data, many are likely to expect that financial products are tailored to their needs. This increases the risk that the scope of the market scanned, and the basis on which products are offered, are left unclear. Transparency around these and other elements is crucial to maintaining trust and ensuring consumers fully understand the nature of the product or service they are buying.

Two findings from our research further highlight the need for transparency. Open banking APIs allow third parties to access banking data with customer consent. However, our survey revealed that many customers still do not fully understand the scope and nature of banking data being shared: more than 75% of surveyed adults did not know the extent of banking data accessible by a third-party app through open banking APIs when consent was given. 
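For illustration, a consent-scope check of this kind can be sketched in a few lines of code. The scope names, account fields and function below are hypothetical and not part of any real open banking API; the point is simply that a third party should only ever see the subset of banking data covered by the customer’s consent.

```python
# Hypothetical sketch: releasing to a third-party app only the data fields
# covered by the customer's consent. Scope and field names are illustrative
# assumptions, not any real open banking API.

CONSENTED_SCOPES = {"balances", "transactions"}  # scopes the customer granted

ACCOUNT_RECORD = {
    "balances": {"current": 1250.40},
    "transactions": [{"amount": -32.00, "merchant": "Grocer"}],
    "standing_orders": [{"amount": 500.00, "payee": "Landlord"}],
    "direct_debits": [{"amount": 45.00, "payee": "Utility Co"}],
}

def data_visible_to_third_party(record, scopes):
    """Return only the account fields covered by the customer's consent."""
    return {field: value for field, value in record.items() if field in scopes}

shared = data_visible_to_third_party(ACCOUNT_RECORD, CONSENTED_SCOPES)
```

In this sketch, standing orders and direct debits are never released because the customer did not consent to those scopes; making that boundary visible to the customer is precisely the transparency gap the survey finding points to.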

Our survey also unearthed a high degree of reluctance to share banking data with third-party apps and insurers: 83% of smartphone users did not have a budget management app, one of the main reasons being their reluctance to share their financial data. Interestingly, 79% of smartphone users were unwilling to share their banking data with an insurance company in exchange for tailored insurance products or discounts.

Enhanced transparency is even more important in a world where financial decisions are increasingly AI-powered. EU data protection rules require explicit consent from individuals to use their personal data to perform automated decision-making or profiling activities. Beyond regulatory compliance, communication about how firms use AI and customer data is key to building and maintaining customer trust around the technology, which remains low. When asked whether they would be more likely to trust financial advice provided by AI or by a human, 75% of survey participants said they trusted humans more or were unsure. 

Ethics, fairness and vulnerability

The shift towards digitally-enabled FS also raises some nuanced conduct and ethical issues, particularly around potential unfair treatment of customers.

The risks of price discrimination and bias against particular customer segments are enhanced when complex AI and/or machine learning models are trained on inherently biased data. Therefore, explainability of AI-driven outcomes is key to assessing whether decision drivers are in line with the firm’s risk appetite and ethical standards.
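To illustrate what explainability of a decision driver can mean in practice, the sketch below decomposes the score of a simple linear model into per-feature contributions. The weights, feature names and applicant values are hypothetical assumptions; real credit models are far more complex, but the principle of attributing an outcome to its inputs, so it can be checked against risk appetite and ethical standards, is the same.

```python
# Illustrative explainability check for a linear scoring model: decompose a
# score into per-feature contributions so a reviewer can see which inputs
# drive the outcome. All weights and features are hypothetical.

WEIGHTS = {"income": 0.4, "missed_payments": -1.5, "account_age_years": 0.2}

def contributions(applicant):
    """Per-feature contribution of each input to the overall score."""
    return {f: WEIGHTS[f] * applicant.get(f, 0.0) for f in WEIGHTS}

applicant = {"income": 3.0, "missed_payments": 2.0, "account_age_years": 5.0}
parts = contributions(applicant)
score = sum(parts.values())

# The largest-magnitude contribution identifies the dominant decision driver,
# which can then be reviewed for bias against particular customer segments.
dominant = max(parts, key=lambda f: abs(parts[f]))
```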

Digitisation also raises financial exclusion concerns for customers who do not have access to, or are unable to navigate, digital channels, particularly those who are vulnerable. AI-powered digital solutions can help in identifying such vulnerability and providing timely support. However, our survey highlighted that support needs to be handled carefully, especially when data is shared across multiple parties: 34%, 29% and 25% of surveyed adults indicated that they would be worried, annoyed or surprised, respectively, if they interacted with digital channels and a human agent contacted them directly to discuss financial support. 

IT governance and controls

The iterative and evolving nature of AI-powered digital channels also creates nuanced hotspots around governance and controls. 

Continuous monitoring, testing and updating of the training data and algorithms are key to deploying AI solutions at scale and in line with the firm’s risk appetite and standards.

For cloud-hosted solutions, outsourcing governance and controls need to be robust from the time when the contract is initially entered into, and throughout the course of the contract. 

In an open banking environment where customer data is received from third parties, controls around the accuracy, integrity and completeness of that data are crucial to providing the right product at a particular point in time.
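A minimal sketch of such data-quality controls might look as follows. The required fields and validation rules are assumptions for illustration only; a production control framework would be far broader, but the pattern of checking completeness and integrity before inbound data feeds a product decision is the essence.

```python
# Minimal sketch of completeness and integrity checks on customer data
# received from a third party. Field names and rules are illustrative
# assumptions, not a real data-quality framework.

REQUIRED_FIELDS = {"customer_id", "balance", "transactions"}

def validate_inbound_record(record):
    """Collect basic data-quality issues; an empty list means the record passes."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")  # completeness
    if "balance" in record and not isinstance(record["balance"], (int, float)):
        issues.append("balance is not numeric")              # integrity
    if "transactions" in record and not isinstance(record["transactions"], list):
        issues.append("transactions is not a list")          # integrity
    return issues

good = {"customer_id": "C1", "balance": 100.0, "transactions": []}
bad = {"customer_id": "C2", "balance": "n/a"}
```

Records that fail such checks would be quarantined rather than used to tailor a product, since acting on inaccurate or incomplete data risks offering the wrong product at the wrong time.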

Finding the balance

Transitioning to a successful digital customer experience is a balancing act between managing the nuanced regulatory hotspots and reducing friction in the customer interaction. Developing the right governance, controls and communication will be key to striking this balance. 

All figures, unless otherwise stated, are from YouGov. The total sample size was 1,928 UK smartphone owners. Fieldwork was undertaken on 25-26 March 2019. The survey was carried out online. The figures have been weighted and are representative of all UK adults (aged 18+).
