Data Protection and Privacy: Is the consent model broken?
Are companies violating individuals’ human rights by relying on consent and legitimate interest to process personal data? On 3rd November 2019, the Joint Committee on Human Rights (“JCHR”) published a report on the impact that companies’ personal data processing practices have on human rights, and on whether the current regulatory regime is sufficiently robust.
The Data Protection Act 2018 and General Data Protection Regulation (“GDPR”) were introduced eighteen months ago, with a key aim of giving individuals greater control over their personal data. The Human Rights Act 1998 protects, among other things, individuals’ right to respect for private and family life and freedom from discrimination. The JCHR considered whether the data protection legislation has fallen short of its goal, and whether its implementation by private (primarily online) companies has failed to prevent them from increasingly encroaching on these fundamental human rights.
The use of the internet and online digital platforms has become central to people’s home and work lives. The rapid development of technology and the growth of companies offering free online services in exchange for data have created new business models in which millions of individuals’¹ personal data is collected and sold online. To do this, companies most commonly rely on ‘consent’ or ‘legitimate interest’ as a legal basis for processing the personal data. However, the JCHR concludes that the ‘consent model is broken’ and that legitimate interest is not sufficiently understood, thus fostering an online industry that is encroaching on individuals’ human rights.
The JCHR report concludes that the consent model unreasonably places the onus on individuals to educate themselves in order to understand the risks of sharing their personal data online. The complexity of privacy policies makes it almost impossible for individuals to understand what consent they are giving. Further, many businesses make use of their online services conditional on agreeing to non-negotiable terms. Consent given in these circumstances is therefore often neither ‘informed’ nor ‘freely given’ as the GDPR requires.
Personal data can be processed where ‘it is necessary for the purpose of the legitimate interests’ pursued by a company. The data protection legislation describes broad areas where legitimate interest could be relied upon. The JCHR report observes that there is a general lack of understanding on what would constitute a legitimate interest and calls for clearer guidance and a rigorous process to test whether companies are using legitimate interests appropriately.
How does this affect human rights?
Despite data protection regulations, companies are routinely buying, selling and sharing people’s data without the individual’s true consent or knowledge, clearly infringing on their right to privacy.
The JCHR report details how data is now being shared, and at times aggregated, to create online profiles of individuals without their knowledge, a concern shared by a number of European privacy regulators including the UK Information Commissioner’s Office. These online profiles are often used for targeted online advertising. The algorithms used for targeted advertising draw inferences from a person’s online profile and decide whether to show that person particular ads, for example for a job or a service. As a result, an individual, who has no way of knowing about their online profile or correcting any of its inaccuracies, may be discriminated against and denied access to certain services and opportunities because of their online demographic.
The JCHR’s conclusions
The JCHR has recommended (among other things) the following:
- Introduction of human rights impact assessments and due diligence to review how a company’s gathering and sharing of customer data may result in an adverse impact on the human rights of their users;
- Stronger enforcement of the data protection regulations, and a review as to whether further legislation is required to make companies take more responsibility for the safety of their users; and
- The introduction of a mechanism, similar to a subject access request, giving individuals the right to request data that companies have generated about them, in order to understand any inferences that have been drawn, such as whether they have been refused access to a service based on inferences taken from aggregated online data.
Why is this important?
The implementation of the data protection legislation is a changing landscape. On 8th April 2019, the Government published its Online Harms White Paper, which concluded that the existing legal framework provides adequate protection against the misuse of people’s data by internet companies. However, the JCHR is urging the Government to reconsider this position and to include the violation of people’s right to privacy and freedom from discrimination in the Government’s list of ‘online harmful activity’. If you are a private company that uses machine learning for targeted advertising, or relies on legitimate interest or consent as a legal basis for processing customer personal data, it may be worth taking the time to consider whether your practices could impact those individuals’ human rights and whether any changes should be made.
¹The Office for National Statistics in the UK stated that 87% of all adults used the internet daily or almost every day in 2019.