Bank customers ill-served by poorly deployed chatbots – US Regulator

The Consumer Financial Protection Bureau (CFPB), the US consumer protection watchdog, says working with customers to resolve problems or answer questions is an essential function of financial institutions and is the basis of relationship banking.

“To reduce costs, many financial institutions are integrating artificial intelligence technologies to steer people toward chatbots,” says CFPB Director Rohit Chopra. “A poorly deployed chatbot can lead to customer frustration, reduced trust, and even violations of the law.”

All of the country's top ten commercial banks use chatbots of varying complexity to engage with customers. Much of the industry relies on simple rule-based chatbots that use decision-tree logic or databases of keywords and emojis to trigger preset, limited responses or to route customers to Frequently Asked Questions (FAQ) pages. Other institutions, such as Capital One with Eno and Bank of America with Erica, have built their own chatbots by training algorithms on real customer conversations and chat logs.
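To illustrate the rule-based pattern described above, here is a minimal, generic sketch in Python. It is not the implementation of any bank's chatbot; the keyword table, preset responses, and the FAQ_URL placeholder are illustrative assumptions.

# Generic sketch of a keyword-driven, rule-based chatbot: keywords trigger
# preset responses, and anything unmatched is routed to an FAQ page.
# All keywords, replies, and URLs below are hypothetical examples.

FAQ_URL = "https://example-bank.test/faq"  # hypothetical fallback destination

KEYWORD_RESPONSES = {
    "balance": "You can view your balance on the Accounts page of the app.",
    "lost card": "To report a lost card, call the number on the back of your statement.",
    "fee": "A summary of account fees is available in your account agreement.",
}

def respond(message: str) -> str:
    """Return a preset response if a keyword matches, else route to the FAQ."""
    text = message.lower()
    for keyword, reply in KEYWORD_RESPONSES.items():
        if keyword in text:
            return reply
    # No rule matched: hand the customer off to the FAQ rather than a person,
    # the kind of dead end the CFPB warns can leave questions unanswered.
    return f"Sorry, I didn't understand that. Please see our FAQ: {FAQ_URL}"

if __name__ == "__main__":
    print(respond("Why was I charged this fee?"))
    print(respond("I want to dispute a transaction"))  # falls through to the FAQ

The second example shows the failure mode the regulator highlights: a request the rules do not anticipate simply bounces to a generic FAQ instead of reaching a person.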

The CFPB says that when customers interact with a bank's chatbot, they can struggle to get the answers they need and instead face repetitive loops of unhelpful jargon.

“Financial products and services can be complex, and the information being sought by people shopping for or using those products and services may not be easily retrievable or effectively reduced to an FAQ response,” states the agency. “Financial institutions should avoid using chatbots as their primary customer service delivery channel when it is reasonably clear that the chatbot is unable to meet customer needs.”

The CFPB says it is actively monitoring the market and expects institutions using chatbots to do so in a manner consistent with their obligations to customers and with the law. It is also encouraging people who have trouble getting answers to their questions because of a lack of human interaction to submit a formal consumer complaint.
