Data privacy: the challenges to artificial intelligence development
What is artificial intelligence?
The term artificial intelligence (AI), whilst commonly associated with autonomous vehicles and domestic robots, can broadly be defined as the development of computer systems that are able to perform tasks analogous to intelligent human behaviour. Driven by the advent of the Cloud and rapidly increasing volumes of digital data, AI developments have taken place in a number of areas, including machine learning: techniques by which computers learn by example and carry out pattern recognition tasks without being explicitly programmed to do so. In order to learn, machines need “big data”, often described as vast volumes of varied data arriving at high velocity. From a data privacy point of view, the biggest implication of AI is the use of big data; therefore properly safeguarding personal data has become increasingly important for businesses.
Impact of the General Data Protection Regulation (GDPR)
The aim of the GDPR, which will be directly applicable in all EU Member States from 25 May 2018, is to give individuals more control over, and the assurance of greater security for, their personal data. The importance of processing personal data fairly is preserved in the GDPR. Big data analytics can be characterised as a threat to privacy: it relies on complex algorithms and draws conclusions about individuals, sometimes with unwelcome effects. A key question for organisations in this context is therefore whether their processing of personal data is fair. Fairness involves several elements, including the following:
1. Effects of the processing
An important concern is the potential for bias in big data analytics. In some circumstances, even displaying different advertisements on the Internet can mean that the users of that service are being profiled in a way that perpetuates discrimination. The GDPR specifically provides that any person – the data subject – has the right “not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her” (Article 22(1) GDPR). The GDPR does not define “legal effect” or “similarly significant effect”.
However, the European data protection authorities - working together in the Article 29 Data Protection Working Party (WP 29) - have recently adopted Guidelines on Automated Decision-making, which are highly likely to impact AI-based services. According to the WP 29’s guidance, “a legal effect suggests a processing activity that has an impact on someone’s legal rights, such as the freedom to associate with others, vote in an election or take legal action”, and “for data processing to significantly affect someone, the effects of the processing must be more than trivial and must be sufficiently great or important to be worthy of attention”. Examples of automated decision-making include automatic refusal of an on-line credit application or e-recruiting practices without any human intervention.
Organisations therefore need to be aware of and factor in the effects of their processing on the individuals, communities and societal groups concerned. It is advisable for data controllers to use appropriate mathematical or statistical procedures for any profiling and take additional measures to prevent discrimination. Given the sometimes novel ways in which data is used in big data analytics, this may be less straightforward than in more conventional data-processing scenarios.
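To make the point concrete, one simple statistical procedure of the kind referred to above is a selection-rate comparison between groups (sometimes called a demographic parity check). The sketch below is purely illustrative: the data and the 0.375 gap are hypothetical, and a real assessment would involve far more than one metric.

```python
# Minimal sketch of one statistical check for discriminatory effects
# in automated profiling: comparing selection rates between two groups.
# Data and function names are hypothetical, for illustration only.

def selection_rate(decisions):
    """Share of positive outcomes (e.g. 'approved') in a group."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(decisions_a, decisions_b):
    """Absolute difference in selection rates between two groups.
    A large gap may signal that the profiling disadvantages one group."""
    return abs(selection_rate(decisions_a) - selection_rate(decisions_b))

# Hypothetical outcomes of an automated credit decision (1 = approved)
group_a = [1, 1, 0, 1, 1, 0, 1, 1]  # selection rate 0.75
group_b = [1, 0, 0, 0, 1, 0, 0, 1]  # selection rate 0.375

gap = demographic_parity_gap(group_a, group_b)
print(f"Selection-rate gap: {gap:.3f}")  # → 0.375
```

A gap of this size between comparable groups would prompt further investigation of the model and its training data, though what counts as an acceptable disparity is context-specific and ultimately a legal and ethical judgement, not a purely statistical one.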
2. Transparency
The complexity of machine learning can make it extremely difficult for organisations to be transparent about the processing of personal data. There is an intrinsic difficulty in providing an explanation for an outcome when that outcome is based on an AI algorithm, as the logic behind the machine's reasoning may not be expressible in human terms. In some instances it may not even be apparent to individuals that their data is being collected (e.g. their mobile phone location). This lack of transparency can mean that businesses miss out on the competitive advantage that often comes from gaining consumer trust. The GDPR is therefore just one of a growing number of forces driving explainable AI. In theory, explainable AI should produce models whose outputs can be interpreted and justified, while maintaining a high level of predictive accuracy. Explanation is essential to ensuring transparency, and businesses should start considering it at the early design stages of AI product development.
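By way of illustration, the simplest form of explainable model is one where each input's contribution to the outcome can be reported directly, as in a linear scoring model. The weights and feature names below are invented for the sketch; the point is only that such a model can tell the data subject which factors drove a decision.

```python
# Sketch of a trivially explainable model: a linear score in which each
# feature's contribution to the decision can be itemised and disclosed.
# All weights and feature names here are hypothetical.

WEIGHTS = {"income": 0.5, "years_at_address": 0.3, "existing_debt": -0.8}

def score_with_explanation(applicant):
    """Return the overall score plus a per-feature breakdown."""
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    return sum(contributions.values()), contributions

total, why = score_with_explanation(
    {"income": 4.0, "years_at_address": 2.0, "existing_debt": 1.5}
)
# 'why' itemises each factor's contribution and can be communicated
# to the individual, e.g. "existing_debt reduced your score by 1.2"
```

Modern machine-learning models are, of course, far less transparent than this, which is precisely why explanation needs to be designed in from the outset rather than retrofitted.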
3. Expectations
Organisations additionally need to consider whether the use of personal data in big data analytics is within people's reasonable expectations. Deciding what is a reasonable expectation is inevitably linked to the issue of transparency. The view is often put forward that people are becoming increasingly less concerned about how their personal data is used. This is said to be particularly true of ‘digital natives’ – younger individuals who are happy to share personal information via social media. However, research suggests that this view can be too simplistic, given the complexities of AI and additional issues surrounding consent to the processing of personal data.
Issue of consent
If an organisation is relying on an individual's consent for processing their personal data, then that consent must be a freely given, specific and informed indication that they agree to the processing. The GDPR provides that consent must also be “unambiguous” and given by a “clear affirmative action”, such as ticking a box on a website. These requirements can pose particular problems for Cloud-based voice assistants, for example: will it be possible to ask for the consent of each individual present in a room before data is collected on what is being said?
Data privacy therefore poses a number of challenges to AI development. Data protection awareness will undoubtedly become increasingly relevant, and organisations should establish specific governance guidelines for dealing with AI, addressing not only the overall technical and data-input processes but also a range of legal and ethical issues.
For more information, please contact Anjali on +44 (0)1483 252 576 or at Anjali.Chandarana@crsblaw.com.