The Online Safety Act (UK): Contents, Implementation, and Compliance

Reading Time: 5 minutes

Author: Benjamin Daley | Data Protection Advisor

Date: 22 November 2023

The controversial Online Safety Act (OSA) became UK law on 26 October 2023, presenting online service providers with new obligations to prevent and remove harmful content from their platforms. The regulator, Ofcom, has received enhanced enforcement powers and has published its implementation approach, with envisaged timelines for each stage. Organisations of any size providing internet user-to-user and search services (“regulated services”) must therefore respond to the Act’s requirements and introduce compliance measures ahead of regulatory implementation, with full enforcement expected from Q3 2026. This article explains the key and controversial regulatory reforms in the OSA, the implementation timeframes, and measures to support compliance. 


Key provisions

The 2019 Online Harms White Paper called for legislation to govern digital services; the resulting Online Safety Bill, drafted in 2021, has since developed into the OSA. Public commentary has focused on child safety and user empowerment, giving users effective tools to manage their exposure to online harms. Ofcom will focus on preventing unwitting exposure to abusive material, enforcing age restrictions on pornography, tackling terrorist content, and deterring online fraud. 

The OSA takes a zero-tolerance approach to content that harms children, requiring that all online service providers: 

  • remove illegal content quickly or prevent it from appearing in the first place, 
  • prevent children from accessing harmful and age-inappropriate content, 
  • enforce age limits and use age-checking measures on platforms where content harmful to children is published (a simplified age-gate sketch follows this list), 
  • are more transparent about the risks and dangers posed to children on their sites, and 
  • provide parents and children with clear and accessible ways to report problems online when they do arise. 
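
The Act is technology-neutral on how age checks must work, and Ofcom’s accreditation criteria are still to come. Purely as an illustrative sketch – every name below is hypothetical rather than an approved method – a provider might deny age-restricted content by default unless a third-party age-assurance check succeeds:

```python
# Illustrative only: the OSA does not mandate this design, and
# AgeCheckResult stands in for the output of a hypothetical
# third-party age-assurance service.
from dataclasses import dataclass

MINIMUM_AGE = 18  # e.g. for pornographic content under the OSA

@dataclass
class AgeCheckResult:
    verified: bool       # did the third-party check complete successfully?
    estimated_age: int   # age asserted or estimated by that service

def may_view_restricted_content(result: AgeCheckResult) -> bool:
    """Deny by default: an unverified user is treated as a child."""
    return result.verified and result.estimated_age >= MINIMUM_AGE

print(may_view_restricted_content(AgeCheckResult(False, 30)))  # False: unverified
print(may_view_restricted_content(AgeCheckResult(True, 30)))   # True
```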

To achieve this, the OSA has equipped Ofcom with enhanced investigatory powers. The contentious ‘spy clause’ – originally Clause 122 of the Bill – has been retained as the section 121 notice, and its drafting remains as ambiguous as the nickname suggests. Such a notice can require “regulated service” providers to use “accredited technologies” to scan messages and files that are otherwise end-to-end encrypted. Although the intention is noble – to detect and remove terrorism and Child Sexual Exploitation and Abuse (CSEA) communications – this nevertheless raises serious concerns from privacy and security perspectives. Tech platform providers, academics and advocacy groups have been vocal in their opposition to mandated surveillance backdoors in software, arguing that they undermine encryption standards and privacy rights by allowing access to otherwise end-to-end secured private data; the implications are serious enough that some providers have threatened to leave the UK market altogether. 
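
The Act does not specify how an “accredited technology” would operate, but the mechanism most often discussed is client-side scanning: inspecting content on the user’s device before it is encrypted. The minimal sketch below is purely illustrative – the hash list and design are hypothetical, and deployed systems would likely use perceptual rather than exact cryptographic hashes – but it shows why critics argue the approach breaks the end-to-end model: the plaintext is examined before encryption ever takes place.

```python
# Hypothetical sketch of client-side scanning; the OSA specifies no such
# design. KNOWN_HARMFUL_HASHES stands in for a regulator-supplied database
# of digests of known CSEA/terrorism content.
import hashlib

KNOWN_HARMFUL_HASHES: set[str] = {
    "0" * 64,  # placeholder digest, not real data
}

def scan_before_encrypt(plaintext: bytes) -> bool:
    """Return True if the message may proceed to E2E encryption.

    The crux of the privacy objection: the plaintext is inspected
    *before* encryption, so 'end-to-end' no longer means that only
    the two endpoints can act on the content.
    """
    digest = hashlib.sha256(plaintext).hexdigest()
    return digest not in KNOWN_HARMFUL_HASHES

if scan_before_encrypt(b"an ordinary message"):
    ...  # hand off to the messenger's usual end-to-end encryption
else:
    ...  # block or report, as a section 121 notice may require
```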

It remains to be seen how the OSA will interact with UK GDPR and other relevant legislation, given the overlapping privacy governance provisions and risks. Furthermore, because UK GDPR has extraterritorial scope, organisations complying with the OSA are likely to apply these obligations to other regions of operation, in line with their global privacy programme design. Alternatively, the risks posed to the fundamental right to privacy may prove insurmountable for some organisations, which may instead develop a standalone, OSA-compliant online service for the UK market. 

Any breach of the OSA is subject to Ofcom-enforced penalties of up to £18 million or 10% of an organisation’s qualifying worldwide revenue, whichever is greater, underlining the importance of compliance for organisations of all sizes. Furthermore, under the OSA, senior managers can be held personally liable for a platform’s failure to comply with Ofcom’s CSEA requirements and confirmation decisions, with custodial sentences available for the most serious offences. 
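
Because the cap is “whichever is greater”, the £18 million figure acts only as a floor; for large providers the 10% revenue limb dominates. A small worked example:

```python
def max_osa_penalty_gbp(worldwide_revenue_gbp: float) -> float:
    """Maximum Ofcom fine: the greater of £18m or 10% of an
    organisation's qualifying worldwide revenue."""
    return max(18_000_000, 0.10 * worldwide_revenue_gbp)

print(f"£{max_osa_penalty_gbp(50_000_000):,.0f}")     # £18,000,000 (floor applies)
print(f"£{max_osa_penalty_gbp(1_000_000_000):,.0f}")  # £100,000,000 (10% applies)
```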


Implementation

Ofcom’s focus is not on censorship but rather on the design and governance of online platforms, giving organisations the opportunity to build systems that support free speech in safe environments. The UK may draw upon the approach of the EU Digital Services Act in developing a broad governance framework with surrounding guidance and commentary. 

Ofcom have taken a three-phase approach to implementation, with active consultations throughout each phase: 

  1. Phase 1: illegal harms
    The first guidance, on illegal content duties, is planned to come into force as the Illegal Harms Codes in Q4 2024. 
  2. Phase 2: child safety duties
    The second guidance, on age verification for online pornographic content, and the third, on children’s access and risk assessments, are to be enacted as the Children’s Codes of Practice in Q3 2025.
    The fourth guidance, on the protection of women and girls, is to be made available for consultation in Q2 2025, without a formally scheduled codification date. 
  3. Phase 3: duties on categorised services
    Some providers will have additional duties, outlined in OSA s7, under the as-yet undefined OSA s95 categories 1, 2A, and 2B. Scheduled guidance will determine the threshold conditions for each category, with the Register of Categorised Services to be published in Q4 2024, followed by the Categorised Services Codes coming into force in Q1 2026. 

The Department for Science, Innovation and Technology (DSIT) are to provide greater clarity, in Q1 2024, on registration fees, to be paid by providers of “regulated services” whose qualifying worldwide revenue meets the threshold. The fee regime is expected to come into force in Q2 2025 and will fund the online safety regime. Ofcom’s charging principles are to be published in Q3 2025, and enforcement activities will commence, in collaboration with DSIT through secondary legislation, in Q3 2026. 

The roadmap and timelines for these phases have already been revised, in line with parliamentary processes, and Ofcom have acknowledged that they are likely to change again as engagement activities commence. The above timescales therefore offer guidance rather than fixed deadlines – organisations should note these dates with an asterisk, as subject to change and requiring ongoing monitoring. 

However, some providers are seeking to get ahead of the curve of online regulation. Meta now offers ad-free subscriptions at a monthly cost, following a €390 million fine over its inappropriate reliance on contract as the legal basis for delivering ad services. Meta argues that the new offering supports individual rights and compliance requirements, whilst recouping some of the financial damage and lost earnings from its ad business. Critics counter that dividing services in this way risks commodifying fundamental rights, codifying entrenched social divisions into the architecture of the internet. Providers must therefore ensure that individual rights are actionable, activities are fully compliant with legal requirements, and service users are fully protected from online harms at all times – across all OSA “regulated services”. 


Compliance activities

Organisations of all sizes should consider relevant and appropriate activities to assess their compliance with the OSA, identify any gaps, and introduce mitigating measures. These may include: 

  • Engagement: As the OSA is in its early implementation stages, organisations should take the opportunity to engage with Ofcom’s consultations on its execution. This will better inform the codes of practice and help the organisation understand its role under the OSA, whether it provides any categorised services, and its relationship with Ofcom. 
  • Gap analysis: A gap analysis should be conducted, including Data Protection Impact Assessments for new online services, and updated in line with the identified implementation stages. Organisations can thereby demonstrate proactive compliance, adjusting online service offerings and content moderation practices before OSA obligations come into effect. 
  • Policies, protocols, and procedures: Specific organisational policies, protocols, and procedures may be drafted or amended in response to OSA obligations, including an acceptable use policy, child safety protocols, and incident reporting procedures. This supports a data protection by design and default approach, alongside OSA safeguarding measures. 
  • Develop safe technologies: Data protection by design requires organisations to develop OSA-compliant technology by default. Our award-winning tool CESIUM, as one example, significantly enhances safeguarding processes across multiple environments whilst embedding privacy and cybersecurity controls throughout. Trilateral are accordingly at the forefront of discussions on designing “accredited technologies” under the OSA, offering expert opinion on the safe development and deployment of such tools. 

In an ever-changing online regulatory landscape, it is critical to remain aware of and compliant with legislative requirements. The OSA will change online practices and public expectations of online service providers’ privacy and cybersecurity infrastructure, with its scope and reach yet to be fully understood. Our Data Protection and Cyber Risk team have broad expertise in advising organisations on compliance measures, with experienced colleagues well placed to advise on technical design alongside ethical concerns. Please get in touch to discuss your requirements.
