Understanding the Data (Use and Access) Act 2025: Implications for UK Businesses
On 19 June 2025, after much “ping-pong” between the House of Commons and the House of Lords, the Data (Use and Access) Act (“the Act”) finally received Royal Assent. It is the current government’s version of the former government’s Data Protection and Digital Information Bill ("DPDI Bill"), which lapsed prior to the last general election.
In terms of data protection-related changes, the key point to note is that the Act does not completely overhaul data protection in the UK. Whilst organisations may need to make minor adjustments to their processes and documentation, on the whole the Act is seen as an attempt to reduce red tape and provide greater certainty for businesses. The Act amends, but does not replace, the UK General Data Protection Regulation (UK GDPR), the Data Protection Act 2018 (DPA) and the Privacy and Electronic Communications Regulations (PECR).
The Act covers more than just amendments to the UK data protection regime. It introduces provisions to:
- govern digital verification services;
- facilitate the sharing of customer and business data in a range of sectors including utilities, transportation, and real estate; and
- create a new public register of all underground infrastructure across England, Wales and Northern Ireland.
The most controversial aspects of the ping-pong debate centred on Artificial Intelligence (“AI”) and copyright, with the House of Lords introducing amendments addressing the use of copyright materials to train AI models and related transparency requirements. These amendments were eventually dropped, but the UK government has committed to publishing an economic impact assessment on the AI and copyright proposals found in its Copyright and AI Consultation Paper within nine months of Royal Assent. The Act entrenches this commitment by requiring the Secretary of State (“SoS”) to publish this assessment, together with a report on the use of copyright works in the development of AI systems, and lay both before Parliament within nine months.
Rebecca Steer, Partner in the Commercial team, expanded on the future of AI and copyright regulation in the UK following the enactment of the Act in her article dated 13 June 2025 here.
What are the key changes brought about by the Act in relation to Data Protection?
Data Subject Complaints
The Act introduces a new privacy right in the form of a right to complain. This allows data subjects to complain to data controllers about the controller’s compliance with the UK GDPR more generally, in line with existing guidance from the Information Commissioner’s Office (“ICO”) on this topic. The Act requires data controllers to take steps to help people who want to make complaints, such as providing an electronic complaints form. Complaints must be acknowledged within 30 days and responded to ‘without undue delay’.
In addition, individuals will now be required to submit their data protection-related complaints to organisations directly in the first instance. This means that complaints can only be escalated to the ICO where the complaint has not been dealt with adequately, or where the individual is dissatisfied with an organisation’s response. This is intended to ensure that the ICO focuses on complaints of greater significance, whilst also giving organisations the opportunity to address and resolve complaints first.
The ICO will be publishing updated guidance on the complaints procedure for organisations in winter 2025/2026, which should clarify best practices for handling data protection complaints.
Data Subject Access Requests (“DSARs”)
Under the Act, individuals are only entitled to personal data that organisations can provide following a “reasonable and proportionate” search. The Act also states that where organisations seek clarification on the scope of a DSAR, the response time is paused. These changes codify existing ICO guidance and should give organisations comfort on the scope of the searches they are required to make in response to a DSAR.
Scientific Research
The Act aims to redefine scientific research and modify the consent requirements to simplify the process of reusing personal data initially gathered for specific research projects. It clarifies that commercial research, privately funded research, and any research that can reasonably be described as scientific, fall within the scientific research exemption under Article 89(2) of the UK GDPR (which disapplies certain data subject rights in certain circumstances). The Act also amends the consent requirements so that organisations can obtain broad consent for broad research purposes, which is designed to help address situations where it is not possible to fully identify the purposes of processing at the time personal data is collected.
Legitimate Interests
The Act retains the concept of a “recognised legitimate interest” from the DPDI Bill, whereby organisations relying on this basis will not have to carry out a balancing exercise in certain “recognised” situations. The list of recognised legitimate interests includes sharing data in relation to national security, emergency response and safeguarding vulnerable people. There is also a list of activities where legitimate interests may be relied upon, including intra-group data sharing for administrative purposes, direct marketing and processing to ensure network and information security, but these will still require legitimate interest assessments to be carried out. The SoS can omit, add to or vary these lists where necessary to safeguard an objective listed in Article 23(1)(c) to (j) of the UK GDPR (e.g. public security, the protection of judicial independence and judicial proceedings, and the protection of the data subject or the rights and freedoms of others).
The ICO will be publishing further guidance on these changes in winter 2025/2026.
Automated Decision Making
The Act replaces Article 22 of the UK GDPR, which relates to solely automated decision-making (“ADM”) (i.e. decisions made by automated means with no meaningful human involvement) that has legal or similarly significant effects for the individual (e.g. an online decision to grant a loan, or a recruitment test using pre-set algorithms and criteria).
Under the UK GDPR, ADM is currently permitted only where one of three conditions is met: (i) it is necessary for entering into, or the performance of, a contract between an organisation and the individual; (ii) it is authorised by law; or (iii) the individual has explicitly consented. The Act seeks to relax this by allowing an organisation to make solely automated decisions in a wider range of situations. This means that ADM will generally be allowed, subject to certain safeguards. For example, organisations will need to provide information to individuals about the decisions being taken using ADM, their right to contest those decisions and their right to seek human intervention. Restrictions on the use of special category personal data will continue to apply.
The ICO is set to publish guidance on this in spring 2026.
Special Category Data and Children
The SoS will have powers to expand the list of special category data and to amend the conditions under which such data may be processed. This has been introduced to address emerging uses of data in new technology and to ensure the law is future-proof in relation to particularly sensitive data. For example, it may be that neural data (i.e. data collected from brainwave activity) ought to be special category data, given the ethical considerations around processing this type of data. Similarly, there is an argument that data relating to children requires additional protection as well. The Act introduces a new duty for information society services that are likely to be accessed by children, building on existing obligations under Article 25 of the UK GDPR.
International transfers
The Act sets out a more flexible, risk-based approach to international data transfers. When the SoS is assessing a third country or organisation (i.e. to determine whether it is “safe” for personal data to be sent to that country or organisation), the Act introduces a new “data protection test”. The key point is that the level of protection in a third country or organisation must not be “materially lower” than the standard in the UK, as opposed to the previous standard of adequacy. This new test recognises that other countries’ data protection regimes will not be identical to the UK’s in form and that differences may exist given the cultural context of privacy.
The ICO anticipates it will publish updated guidance on international transfers in winter 2025/2026 to reflect this change, which may include an updated transfer risk assessment template.
Regulatory reform
Under the Act, the ICO will soon be known as the Information Commission and will adopt a new structure comprising a board of non-executive and executive members. The Information Commission will still be required to consider certain factors when exercising its functions, including the promotion of innovation and competition; the prevention, investigation, detection and prosecution of crime; and public and national security. The Information Commission will also have a duty to have regard to children’s vulnerability and the fact that children may be less aware of the risks and consequences of personal data processing and of how they can exercise their rights.
What about E-Privacy, Digital ID, and Smart Data?
Cookie Consents
The Act relaxes some of the cookie consent requirements where the privacy risk to users is low. These include non-intrusive cookies (such as those used for analytics and website display) and those used for security (such as preventing or detecting fraud). However, this is not a blanket exemption: users must still be given information about the purpose of placing the cookies (including the low-risk categories), as well as the ability to opt out. The aim of these changes is to simplify the cookie regime, reduce the frequency of cookie pop-ups, and improve user experience.
Penalties and Enforcement Powers
Currently, the maximum penalty for e-privacy breaches is £500,000. The Act increases these fines to align them with the UK GDPR. This means that breaches of e-privacy rules (including cookie and e-marketing breaches) can attract a maximum penalty of £17.5 million or 4% of annual worldwide turnover, whichever is higher. The aim is to increase consistency across the data privacy enforcement landscape. However, it will inevitably leave more organisations exposed to greater fines, especially in areas where the ICO is already very active, such as nuisance calls and texts.
Codes of Conduct
The Act imposes obligations on the ICO to encourage representative bodies to design codes of conduct to help with e-privacy compliance. There is also provision for accreditation bodies to monitor compliance with these codes. The hope is that these codes will improve organisational efficiency and increase the consistency with which privacy rules are applied.
Smart Data
The Act builds on the approach to open banking and creates a framework that aims to ease information sharing between businesses and regulated/authorised third parties in key sectors such as utilities, transportation, and real estate. Details of sector-specific frameworks will be contained in secondary legislation.
Further, the Act sets out common data standards for IT suppliers across the health and social care sector to enable real-time data sharing across platforms, in order to improve access to healthcare data for public bodies and patients.
Additionally, there are provisions for a new digital register known as the National Underground Asset Register (“NUAR”). The NUAR is a government service which will map all buried infrastructure across England, Wales and Northern Ireland (including gas, electric, water and communication lines) with the aim to improve how such infrastructure is maintained and to create more transparency for businesses and public authorities.
Digital ID
The Act creates a more structured system in which providers of digital verification services (“DVS”) can be certified within a trust framework. The DVS trust framework sets out rules and codes of conduct for providing DVS, and the Act establishes a DVS register to which organisations can apply once they are certified by an accredited body as compliant with the framework. Once registered, the DVS provider will receive a trust mark to enable public recognition. The Act also establishes an “information gateway” allowing public authorities to share personal data with DVS providers so that the individual can receive DVS services, but only if the individual requests it and the disclosure does not breach data protection legislation. The hope is that this will enable better digital ID solutions across different sectors and improve interoperability, while easing the administrative burden of the existing paper-based ID verification process for businesses and public services. However, the Act is at pains to stress that this is not a mandatory national ID system; the digital ID system is purely voluntary.
Online Safety Research
The Online Safety Act (the “OSA”) is amended to allow the SoS to issue regulations requiring providers regulated under the OSA to give online safety researchers access to data from online services, provided any such disclosure does not breach data protection laws. The hope is that this will make it easier for researchers to access data to study online harms, and will provide a framework under UK law similar to that under Article 40 of the EU Digital Services Act.
When do the changes come into effect?
A minority of the aforementioned changes came into force on the date of Royal Assent, 19 June 2025 (e.g. the requirement for organisations to conduct a reasonable and proportionate search as part of the DSAR process), while the remainder will come into force at various points between two and twelve months after Royal Assent. As highlighted, the ICO (soon to be the Information Commission) will be publishing a range of further guidance over the course of the next year on the changes brought about by the Act which affect the UK data protection landscape. Businesses operating in the UK should use this transitional period to review internal practices and ensure their data governance procedures align with the Act.