Expert Insights

Draft Online Safety Bill: Regulating the online world

On 12 May 2021, the UK government published the draft Online Safety Bill (the “Bill”) which establishes a new regulatory regime to address illegal and harmful content online.  The Bill imposes a duty of care on specified online service providers to assess, monitor and take action against illegal or harmful online content.  The penalties for non-compliance are high, with Ofcom (the UK’s communications regulator) gaining new online safety powers to fine businesses up to £18 million, or 10 per cent of qualifying revenue, if they fail in their new duty of care. 

What services are captured?

To tackle the growing concern over online safety, the Bill expands the traditional view that liability for content sits with publishers, extending it to intermediaries that facilitate the sharing of content online.  Services regulated by the Bill include:

  • “user-to-user services”, meaning any online service which allows users to upload and share user-generated content (e.g. social media providers); and
  • “search services”, including search engines which enable users to search multiple websites and databases online.

The territorial reach of the Bill is widely defined: it will apply not only to service providers within the UK but also to services based outside the UK with “links to the UK”.

What are the duties?

The Bill takes a risk-based approach by creating three categories of services, each subject to a different level of regulation.  Duties will vary depending on the type of organisation and the category in which the organisation falls.  Duties imposed on all online service providers include:

  • completion of an illegal content risk assessment;
  • mitigation and management of risks caused by illegal or harmful content;
  • duties to protect the right to privacy and freedom of expression;
  • the introduction of processes to facilitate reporting and redress for users; and
  • a requirement to keep records demonstrating compliance with the Bill.

Additional duties apply to organisations which are “likely to be accessed by children” or which qualify as a Category 1 service.  There are threshold conditions which must be met in order to fall within each category.  However, these threshold conditions have not yet been published and will be set out in supplementary legislation made by the Secretary of State.

A range of exempt services is outlined in the Bill, including email services, text messaging services, intermediary services enabling online reviews, and user-to-user services provided by foreign States or by public authorities discharging public functions.

What next?

The Bill will now be subject to pre-legislative scrutiny by a joint committee of Members of Parliament on its road to enactment.  If enacted, the Bill will come into force without a transition period, so businesses should be ready to comply from the outset.

Commentary

The Bill takes a risk-based approach to regulation, introducing proportionality into the online safety obligations of online service providers.  While this may benefit smaller online businesses, which may be able to take a proportionate approach to the regulatory burden, it also creates uncertainty about how businesses can comply with the Bill.  While the Bill includes definitions of what may constitute harmful content, ambiguity remains for businesses trying to determine when content is harmful and to whom.  Supplementary codes of practice, due to be published soon, may provide assistance in this regard, but businesses will still have to decide not only what content is illegal, but also what content may be harmful to their users.  In addition, the Bill requires businesses to create mechanisms and processes for reporting and documenting their compliance, creating a significant regulatory burden.

Debate remains over whether online service providers should be the gatekeepers of online safety.  There are ongoing concerns about excessive censorship and online monitoring arising from the enhanced duties on online service providers to proactively monitor user content and mitigate risk, including by taking down user content.  Businesses will inevitably find it difficult to mitigate and monitor harmful online content while at the same time safeguarding users’ rights to freedom of expression and privacy.

Online service providers have come to pervade communication and social interaction in an increasingly digital world, and they have done so in an environment where regulation has not kept pace.  The Bill addresses the undeniably important issue of online harm in today’s digital world, but it remains to be seen whether it strikes the right balance between protection and privacy.
