The UK Online Safety Act (OSA), which received Royal Assent in October 2023, is one of the most significant pieces of digital legislation introduced in the UK to date. As of 17 March 2025, the first set of enforceable duties under the OSA came into force. These relate to illegal harms, requiring in-scope services to conduct risk assessments and implement safety measures to protect users from illegal content.
While attention has often focused on its long-term implications for child safety and harmful content, the immediate enforceability of illegal harms duties presents a critical compliance moment for regulated services. This article provides an overview of the current legal position, explains Ofcom's expectations under the new Codes of Practice, and sets out what steps businesses should now take to remain compliant.
This article focuses on scope, categorisation, and the illegal harms duties (Section 1), followed by the Codes of Practice, Ofcom enforcement, and strategic risk areas (Section 2).
Scope, categorisation, and the illegal harms duties
To understand the practical reach of the Online Safety Act, it's essential to first clarify which services are caught by the regime and what duties now apply.
Which services are in scope?
The OSA applies to two categories of regulated services: user-to-user (U2U) and search services. These include:
- social media and messaging platforms;
- online forums and community sites;
- gaming services with user chat functions;
- search engines;
- marketplaces that host user-generated content.
The OSA has extra-territorial reach. Services based outside the UK must comply if they have a significant number of UK users or target UK users in their design or promotion.
Illegal harms duties: March 2025 enforcement milestone
From 17 March 2025, services must comply with their duties to mitigate exposure to illegal content. In particular, they must:
- identify and assess risks of illegal content (risk assessment deadline was 16 March 2025).
- implement proportionate safety measures to mitigate those risks.
- follow Ofcom’s Codes of Practice or implement equivalent alternative measures (and be able to evidence their effectiveness).
The duties cover a wide range of illegal content, including terrorism, CSAM (child sexual abuse material), hate speech, fraud, and other criminal conduct.
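One way to operationalise the "assess, mitigate, evidence" cycle behind these duties is to hold each identified risk as a structured, reviewable record. The sketch below is a minimal illustration of such a risk-register entry; the class and field names are hypothetical and are not drawn from the Act or from Ofcom guidance.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class IllegalContentRisk:
    """Hypothetical risk-register entry for an illegal harms assessment."""
    harm: str                  # e.g. "fraud", "terrorism content"
    feature: str               # service feature creating exposure, e.g. "live chat"
    likelihood: str            # e.g. "low" / "medium" / "high"
    mitigations: list[str] = field(default_factory=list)  # safety measures applied
    evidence: list[str] = field(default_factory=list)     # effectiveness data
    last_reviewed: date = field(default_factory=date.today)

# Example entry: a marketplace feature assessed for fraud risk.
risk = IllegalContentRisk(
    harm="fraud",
    feature="user-to-user marketplace listings",
    likelihood="medium",
    mitigations=["keyword screening", "seller verification"],
    evidence=["Q1 moderation metrics report"],
)
```

Keeping mitigations and effectiveness evidence on the same record makes it easier to show, risk by risk, that safety measures exist and are working.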
Categorisation thresholds: Who has enhanced duties?
On 27 February 2025, the UK government published threshold conditions dividing services into three categories:
- Category 1 (large user-to-user services): over 34 million UK users where the service uses a content recommender system (or over 7 million UK users where it also allows users to forward or re-share user-generated content).
- Category 2A (large search services): over 7 million UK users, where the search function is not limited to vertical (single-topic) search.
- Category 2B (user-to-user with private messaging functions): over 3 million UK users with direct messaging capability.
These thresholds trigger enhanced obligations such as transparency reporting, user empowerment tools, and child risk assessments. Providers must determine whether they fall within these categories and prepare accordingly.
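For teams mapping these thresholds into internal tooling, the following is a minimal sketch of the decision logic, assuming a simplified service profile. The function and field names are illustrative only; the statutory tests turn on precise definitions (for example, how UK users are counted) that this sketch does not capture.

```python
from dataclasses import dataclass

@dataclass
class ServiceProfile:
    """Illustrative profile of a regulated service (hypothetical fields)."""
    uk_users_millions: float
    is_user_to_user: bool
    is_search: bool
    has_recommender_system: bool
    allows_content_resharing: bool
    has_direct_messaging: bool

def assess_categories(s: ServiceProfile) -> list[str]:
    """Rough sketch of the February 2025 threshold conditions (simplified)."""
    categories = []
    # Category 1: large U2U services using a content recommender system.
    if s.is_user_to_user and s.has_recommender_system:
        if s.uk_users_millions > 34 or (
            s.uk_users_millions > 7 and s.allows_content_resharing
        ):
            categories.append("Category 1")
    # Category 2A: large search services.
    if s.is_search and s.uk_users_millions > 7:
        categories.append("Category 2A")
    # Category 2B: U2U services with direct messaging.
    if s.is_user_to_user and s.has_direct_messaging and s.uk_users_millions > 3:
        categories.append("Category 2B")
    return categories

# A U2U service with 8m UK users, a recommender system, resharing, and direct
# messaging would meet the Category 1 and Category 2B conditions in this sketch.
print(assess_categories(ServiceProfile(8.0, True, False, True, True, True)))
```

A sketch like this can help with triage, but categorisation decisions should always be confirmed against the regulations themselves.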
Codes of practice, Ofcom enforcement, and strategic risk areas
With the scope and duties under the Act now established, the next step for businesses is to understand how compliance will be measured in practice, beginning with Ofcom’s Codes of Practice.
Codes of practice: a compliance roadmap
Ofcom published its first Codes of Practice in late 2024, setting out how regulated services should meet their illegal content safety duties. The Codes are not mandatory: following them is treated as compliance with the relevant duties, while services that deviate from them must be able to demonstrate that their alternative measures achieve equivalent outcomes.
Key requirements in the Codes include:
- having a structured governance process for risk assessment and mitigation.
- monitoring and moderation of high-risk features (e.g., anonymous posting, live streaming).
- proactive measures against priority illegal content (e.g., terrorism and CSAM).
- user reporting and redress systems.
- clear and accessible terms of service explaining how illegal content is handled.
Regulated services should treat the Codes as the default compliance pathway unless they have a compelling, evidence-based reason to do otherwise.
Enforcement and the CSAM enforcement programme
Ofcom has announced that it is ready to use its enforcement powers, which include:
- issuing information notices;
- issuing provisional or final enforcement notices;
- imposing penalties of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
On 17 March 2025, Ofcom also launched a targeted enforcement programme focused on file-sharing and file-storage providers, particularly where there is elevated risk of exposure to CSAM. These providers have received requests for information and are being evaluated against their risk assessments and safety measures.
In parallel, the Online Safety (CSEA Content Reporting) Regulations 2025 (in force from 3 November 2025) impose duties to report CSAM to the National Crime Agency (NCA) and to retain relevant information securely.
Strategic risks: platform design, AI, and transparency
The OSA dovetails with UK GDPR and the Age Appropriate Design Code. For platforms using algorithmic recommendation systems or AI-generated content, duties to ensure transparency and prevent amplification of illegal content are particularly important.
Practical areas of focus include:
- risk profiling of recommender systems;
- content moderation at scale;
- age assurance and age verification technologies;
- alignment of risk assessments with DPIAs and Article 30 records;
- cross-border data governance where content moderation is handled outside the UK.
Practical recommendations
- Confirm in-scope status and categorisation: All digital services should re-evaluate their exposure under the OSA. Even small platforms may be subject to baseline duties.
- Conduct robust risk assessments: Ensure that illegal content risk assessments are complete, documented, and reviewed regularly. Include technical, human, and systemic risks.
- Implement controls aligned with the Codes: Follow the Ofcom Codes where possible. If using alternative approaches, maintain detailed justification and effectiveness metrics.
- Integrate with existing governance: Align OSA compliance with existing UK GDPR, PECR, and AI governance frameworks. Risk assessments and mitigation measures should not be siloed.
- Prepare for enforcement: Maintain internal audit trails, version-controlled risk assessments, and senior oversight records. Be ready to respond to Ofcom information requests.
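Where teams automate the last point, an append-only, versioned log of risk assessment changes can help evidence who approved what, and when. The following is a minimal hash-chained sketch; it is entirely hypothetical and not an Ofcom requirement.

```python
import hashlib
import json
from datetime import datetime, timezone

def append_version(log: list[dict], assessment: dict, approved_by: str) -> None:
    """Append a tamper-evident version entry: each entry hashes its predecessor."""
    prev_hash = log[-1]["hash"] if log else ""
    entry = {
        "version": len(log) + 1,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "approved_by": approved_by,
        "assessment": assessment,
        "prev_hash": prev_hash,
    }
    # Hash the entry (including the previous hash) so any later edit is detectable.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)

# Example: record two approved versions of an assessment.
log: list[dict] = []
append_version(log, {"harm": "CSAM", "likelihood": "low"}, approved_by="DPO")
append_version(log, {"harm": "CSAM", "likelihood": "medium"}, approved_by="DPO")
```

Because each entry commits to its predecessor's hash, the log provides a simple, verifiable audit trail to support responses to Ofcom information requests.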
Comments
The OSA has now moved from principle to enforcement. The introduction of enforceable illegal harms duties marks a regulatory inflection point for digital services operating in or targeting the UK.
In-scope services must now treat online safety as a live regulatory issue, not a compliance box-ticking exercise. With Ofcom's enforcement powers now operational, a failure to engage with the Codes of Practice or to evidence robust internal controls could result in significant financial, legal, and reputational consequences.