    Expert Insights

AI and Dispute Resolution: Managing Legal Risk in an Evolving Landscape

The use of AI by businesses in their services, operations and products has increased exponentially over the past few years, but as businesses seek to unlock the benefits of AI there exists a potential litigation risk that parties should be aware of.

The complexity of AI systems, and of the commercial agreements that govern their use, means that the potential for disputes is heightened: whether those disputes arise between contracting parties over the use of an AI system, or from legal liabilities that use of the system could create in relation to third parties.

Contractual disputes arising between AI suppliers and customers

Disputes between providers of AI services and their customers are likely to increase as the technology becomes more embedded in the day-to-day operations of businesses. Contractual disputes between customers and the suppliers of AI systems can be complicated by the following factors:

  • difficulties pursuing claims against a supplier for breach of contract due to uncertainties over deliverables;
  • uncertainties as to what constitutes a breach of the agreement (for example failure to conform with specification / objectives / standards agreed for the system itself);
  • alternative breach such as fault with the data set on which the AI is trained rather than fault with AI system itself;
  • complex issues of causation and loss; 
  • risk of reliance on implied terms (for example, software is not a ‘good’ with the associated implied terms unless supplied in its stored medium); and
  • the extent of indemnities given as to losses caused to third parties. 

These issues, and measures that could be taken to obviate the underlying concerns, are considered further below.

Liability for AI errors will of course depend on context, but the tort of negligence and implied statutory terms may give rise to further causes of action, unless and until specific AI liability legislation is introduced.

It is worth noting that the European Commission has considered this issue at length as part of reform to its product liability regime. However, having proposed AI-specific legislation adapting non-contractual fault-based civil liability rules to AI, it has now indicated that this will not proceed.

Care should be taken when drafting agreements to keep abreast of legislative changes to ensure commercial agreements remain up to date. 

Liability to third parties arising from the use of AI systems

As well as pure inter-party disputes, there is also potential that utilisation of an AI system could (in certain circumstances) create liabilities to third parties. This has been touched on elsewhere in this Guide but may include areas such as:

  • IP infringement;
  • data protection breaches;
  • discrimination (employment);
  • product liability; and
  • regulatory breaches / concerns.

In the area of product liability, the EU has recently passed a revised Product Liability Directive, which lays down common rules on the liability of economic operators for damage suffered by natural persons caused by defective products, including certain software and AI-enabled products.

These disputes will all, of course, be fact-dependent, but before implementing any AI system care must be taken to consider how the system will be used and where the risk for that use lies.

Minimising contractual and operational risk in AI projects

Tips to minimise risks at the contract drafting stage include:

  • agree the correct forum in the contract for disputes relating to the AI system;

  • agree the standards / specification / objectives with which the AI system should conform or against which it should perform;
  • ensure good governance and thorough review by the customer;
  • maintain clear records and engage with relevant processes (relevant to causation and loss); and
  • agree robust warranties and indemnities in the contract to minimise the impact on the customer or supplier (as the case may be) and to deal with potential liabilities to third parties.

Resolving AI disputes through arbitration and ADR

In terms of agreements concerning the use of AI, care should be taken as to what forum is being selected for the resolution of disputes.

Resolving disputes via arbitration rather than in the national courts should be considered, as the confidentiality of arbitral proceedings may be beneficial in keeping a business’s use of AI out of the public court system.

Increasingly, arbitral institutions are introducing bespoke rules that deal with AI disputes (see, for example, the JAMS AI Disputes Rules introduced in 2024), which may help with resolving such disputes expeditiously. These rules often focus on keeping business-sensitive data as confidential as possible whilst allowing tribunals to rule on the matters in dispute.

Other forms of ADR (alternative dispute resolution) should also be considered. UK courts will be keen to establish that there has been an attempt to resolve a dispute before resorting to the court system. At some point, it may also need to be considered whether the dispute itself should be decided by AI.

Managing AI-related dispute risk

Managing AI-related legal risks requires a proactive, multi-layered approach as UK regulations evolve. Key concerns include data protection, IP rights, contractual liability, and algorithmic bias. Organisations must comply with laws like UK GDPR and the Data Protection Act 2018, especially when AI handles personal data or makes automated decisions.

IP disputes are increasing, particularly around copyrighted content used in AI training. Clear contracts are vital to define responsibilities when AI systems fail or produce biased results. Legal teams should adopt governance frameworks, conduct fairness audits, and participate in regulatory sandboxes. As AI grows more autonomous, assigning accountability remains a major legal challenge.

What’s next for AI and Dispute Resolution law?

AI is set to transform dispute resolution law, blending tech innovation with evolving UK regulations. As tools like generative AI assist with drafting, research, and arbitration, legal professionals face challenges around accountability, transparency, and fairness.

UK firms are cautiously adopting AI to boost efficiency while maintaining human oversight to prevent errors. The future points to hybrid models where AI supports—but doesn’t replace—human judgment. Regulators are expected to introduce clearer guidelines on ethical use, data integrity, and admissibility of AI-generated evidence. The key challenge remains balancing innovation with justice and due process.
