

06 January 2020

Chatbots – Trends, Predictions and Five Key Legal Issues

There was a global increase in the use of chatbot technology in 2019. The market is growing rapidly, as solutions become more sophisticated and capitalise on developments in artificial intelligence and natural language processing. We take a look below at the trends we’re seeing and consider some legal issues associated with the use of chatbots.

What do businesses use chatbots for, and why?

Use cases for chatbots (i.e. conversational software that interacts with people through text or voice) are expanding as solutions powered by artificial intelligence become more flexible in “learning” what users like and more accurate in interpreting what users mean. As a result, the global chatbot market is predicted to grow from $2.9 billion in 2019 to $9.4 billion by 2024 (MarketsandMarkets).
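To make the idea concrete, here is a toy sketch of intent recognition. The intent names and keyword lists are invented for illustration, and real chatbot platforms use trained NLP models rather than simple keyword overlap:

```python
import re

# Toy intent matcher (illustrative only: the intents and keywords below are
# invented; production systems use trained NLP models, not keyword overlap).
INTENTS = {
    "order_status": {"where", "order", "track", "delivery"},
    "returns": {"return", "refund", "exchange"},
    "recommendation": {"recommend", "suggest", "wine", "gift"},
}

def classify(message: str) -> str:
    """Return the intent whose keywords best match the message."""
    words = set(re.findall(r"[a-z]+", message.lower()))
    scores = {name: len(words & keywords) for name, keywords in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "fallback"
```

The gap between this sketch and a production system (training data, context tracking, integration with back-end systems) is precisely why most businesses turn to external providers.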

Chatbots are predominantly used as part of customer service functions but are also deployed elsewhere, such as internal HR and IT helpdesks. The technology lends itself well to consumer sectors, such as retail, with chatbots being the most common use of artificial intelligence across that sector. Numerous retailers have reported successful use of chatbots; for example, Lidl’s “winebot”, Margot, guides customers through wine choices via Facebook Messenger and Levi’s offers a virtualised stylist chatbot.

The technology drove $7.3 billion of global retail sales in 2019, according to Juniper Research. That number is predicted to increase to $112 billion by 2023, with associated cost savings of $439 million, due to the significant reduction in human resources required to operate equivalent functions.

However, cost savings are not the only benefit. Chatbots offer huge potential to improve the consumer journey and extend the omnichannel retail experience, and Juniper warns that retailers who do not adopt the technology will face strong challenges from “technologically-adept disruptors”. Retailers will be able to learn from a consumer’s past choices and customise their experience by building a profile based on previous interactions and data from other channels, such as social media. The data retrieved can then be more easily and quickly recorded, stored and analysed, with retailers developing insight from millions of transactions and interactions.

Legal issues

More often than not, a business will need to rely on external providers to develop, supply, integrate and update its chatbot solution. The terms on which the supplier is engaged will therefore need to be considered. The supplier contract is likely to take the form of a software development, integration and licence agreement, but it is important to tailor it to the context.

Here are five key legal issues to consider when commissioning your chatbot solution.

1. Continuous Evolution - Consider where you want to be in two years’ time. Chatbot technology is evolving fast and you might therefore want the flexibility to change your provider, if another has developed a product that better suits the evolving requirements of your business.

You might decide to use a provider on a trial basis, not only to learn about that provider’s solution but to learn about how the concept works best for your business and how your customers react to, and interact with, the solution.

When contracting with a third party, contract durations are therefore likely to be short. For anything other than the shortest of contracts, however, the solution should be treated like a product or ongoing service rather than a one-off project, as the technology is likely to evolve quickly. Unless the contract covers a short trial only, it should reflect this, and is likely to need to accommodate agile development methodology.

Scaling up after a trial is also likely to be complex, given the number of touchpoints between a complex chatbot and the other systems used by your business. Consider how best to protect your business along the way. Can usable software, for example, be provided at intervals and, ultimately, used without your provider to ensure that your business receives some value from a project that hits stumbling blocks?

2. IP Ownership - For the same reason, consider whether you should own the solution that is provided, so that it can be developed further in the future on your behalf, or whether a future provider would deploy its own solution instead. Where the chatbot uses machine learning, who should own any developments based on your data? You are likely to want to prevent your competitors from benefiting from algorithms developed using your data, although you might sometimes accept that the solution has been developed using the data of others and that shared benefits are part of the deal with a particular provider.

3. Subcontractors - Many chatbot solutions rely on standard third party tools, especially when using artificial intelligence. IBM’s Watson or Google’s Dialogflow, for example, may power the solution. Whilst these tools are useful, one drawback is that, as they are standardised products, the providers tend to insist on using their own standard terms. If a separate third party developer is providing your chatbot solution, you should ensure you are clear about who is actually providing the service, what third parties (such as IBM or Google) can be used and who you are contracting with, so that proper due diligence can be conducted. Be aware that your solution provider may be limited in what it can agree if it is bound by the standard terms of its contractors.

4. Misbehaving chatbots - Chatbots that employ machine learning tools have been known to misbehave. Microsoft’s Tay started life as an innocent bot, designed to mimic the language patterns of a 19-year-old American girl and to interact with young Twitter users. Within 24 hours, Tay (learning from her interactions) had become a sexist, racist troll and was promptly shut down by Microsoft. Even Amazon’s Alexa (a virtual assistant) has been known to go off the rails, in one reported case advising a user to “kill your foster parents”.
It’s important to establish who is responsible if your chatbot responds in a way that might cause upset. Should your provider take responsibility or, if the technology employs machine learning, is it your responsibility as provider of the data that feeds the development of the output? In any event, it’s important that the solution is tested robustly and that there is a level of built-in oversight and, ultimately, a failsafe to prevent any harmful activity escalating.
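One way to picture that built-in oversight is an output filter that every reply must pass before it reaches the user, with a safe fallback when it fails. This is a minimal sketch only (the blocklist and wording are invented); real deployments would use trained moderation classifiers, logging and escalation to a human agent:

```python
import re

# Minimal failsafe sketch: screen each generated reply before sending it.
# The blocklist here is a placeholder; real systems use trained moderation
# classifiers, audit logging and escalation to a human agent.
BLOCKLIST = {"kill", "hate"}

SAFE_FALLBACK = "Sorry, I can't help with that. Let me connect you to a colleague."

def moderate(reply: str) -> str:
    """Return the reply unchanged, or the safe fallback if it trips the filter."""
    words = set(re.findall(r"[a-z]+", reply.lower()))
    return SAFE_FALLBACK if words & BLOCKLIST else reply
```

Whoever bears contractual responsibility for harmful output, a filter of this kind gives the deploying business a last line of defence between the model and the customer.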

5. Data protection – It’s highly likely that you will be processing personal data, or that your provider will be processing personal data on your behalf, as part of your chatbot solution. It therefore goes without saying that, in the context of the UK (and Europe), you will need to consider GDPR compliance issues. For customer service solutions, consider the lawful grounds for processing and how to balance GDPR compliance with a seamless customer experience. If relying on customer consent to process personal data, how will that consent be procured? Would it be less disruptive to your customer’s experience, and still compliant, to rely on “legitimate interests” as a lawful ground for processing?

With respect to fair processing information and transparency, steps should be taken to draw the customer’s attention to the relevant processing (e.g. by stating that a call is recorded). UK ICO guidance also suggests that the consumer should be informed of at least the fundamental information, such as the basic purpose of processing, the identity of the data controller and the rights of the individual. Further information should be made available to the consumer on request.
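By way of illustration only, that fundamental information could be surfaced in the chatbot’s opening message, with a link out to the fuller privacy notice. The controller name, purpose and URL below are placeholders; this is a sketch, not a compliance template:

```python
# Illustrative opening message surfacing basic fair-processing information.
# The controller name, purpose and URL are placeholders: this is not legal
# advice or a compliance template.
def opening_notice(controller: str, purpose: str, privacy_url: str) -> str:
    return (
        f"Hi! You're chatting with {controller}'s virtual assistant. "
        f"We process what you type here to {purpose}. "
        f"You have rights over your data; full details: {privacy_url}"
    )
```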

Chatbot deployments are complex and do not always succeed, but with careful planning and the right contractual terms in place the benefits are clear, and we expect to see further growth in deployment and development throughout 2020.


For more information please contact Chris Ingram on +44 (0)20 7438 2135 or at chris.ingram@crsblaw.com.
