Online Safety Bill: briefing note

Update – 26 October 2023 – Royal Assent  

The Online Safety Act received Royal Assent on 26 October 2023. The Act incorporates provisions to compel social media companies to remove harmful content from their platforms, including in relation to fraud and terrorism.

Technology Secretary Michelle Donelan said: “Today will go down as an historic moment that ensures the online safety of British society not only now, but for decades to come.”

All of our previous updates are located at the bottom of this briefing note.

Online Safety Bill

3 May 2022

The UK Online Safety Bill is seen as one of the most far-reaching attempts to date to regulate online content. The Bill aims to deliver the UK Government’s manifesto commitment to make the UK the safest place in the world to be online while also defending freedom of speech.

Failure to comply will result in substantial fines from Ofcom as regulator (of up to £18 million or 10% of a company's annual global revenue) or, in extreme circumstances, the prosecution of company directors and senior managers.

Who’s affected?

The duties imposed by the Bill apply to any ‘user-to-user service’ or ‘search service’. Regulated services will therefore include social media networks, search engines and sites such as forums, messaging apps and some online games. If enacted, the legislation will apply to any company which has users in the UK, even if the company itself is not based in the UK.

Background and timeline

The Bill builds on the UK Government’s earlier proposals to establish a duty of care for online providers with an independent regulator to oversee and enforce compliance with the duty, as laid out in its April 2019 White Paper and its December 2020 response to the White Paper consultation.

The Bill was introduced to Parliament on 17 March 2022 following the publication of the draft Bill in May 2021. It will work its way through the House of Commons before being introduced to the House of Lords. Only once it has undergone the scrutiny of both Houses will the Bill receive Royal Assent and become law.

Key aims of the Bill

Under the new legislation, in-scope platforms will need to:

  1. Tackle 'priority' illegal content and activity by removing illegal content, such as terrorist material and child sexual abuse and exploitation material.
  2. Protect children from harmful or inappropriate content, such as bullying, pornography and the promotion of self-harm.
  3. Tackle legal but harmful content, such as self-harm or eating disorder content. Social media companies will need to make clear in their terms and conditions what is and is not acceptable on their site, and enforce this.
  4. Safeguard freedom of expression and pluralism online.

Measures and powers under the Bill

The legislation will be wide-ranging with new criminal offences and measures including:

  • Criminal sanctions for tech bosses.
  • Criminal offences for not cooperating with Ofcom, including falsifying or destroying data, with offenders potentially facing up to two years in prison or a fine.
  • A requirement for social media platforms to tackle legal but harmful content, with Parliament to approve the types of content that must be addressed.
  • Ensuring companies tackle illegal or criminal activity online more quickly.
  • Combatting fraud by addressing paid-for-scam adverts on social media and search engines.
  • The criminalisation of cyber-flashing.
  • A duty for platforms to report any child sexual exploitation and abuse content they encounter to the National Crime Agency.
  • Ensuring 18+ age verification checks for sites that host pornography.
  • New measures to address anonymous trolls online, while giving users more control over what they interact with and are exposed to.

Reaching Royal Assent

Owing to the high level of interest across all parties and both chambers, the government’s draft Bill was subjected to pre-legislative scrutiny by a joint committee of MPs and peers, an unusual procedural step. This resulted in 127 recommendations, 66 of which have been adopted in the new Bill.

However, as the Bill is so far-reaching and in light of the controversy it has attracted to date, it is expected to receive ongoing scrutiny, undergo further changes and ultimately face a lengthy passage through Parliament.

Update – 21 April 2023 - Online Safety Bill begins Lords Committee Stage

The Online Safety Bill had its first Committee Stage sitting in the House of Lords on 19 April 2023.

Following commitments made in the Commons in January, the UK Government has tabled a number of amendments to the Bill. These concern the introduction of additional priority offences that would strengthen the Bill’s illegal content duties, as well as a provision that would require providers of the largest services to publish summaries of their risk assessments for illegal content and content that is harmful to children.

Outside of Parliament, encrypted messaging services are joining forces to urge the Government to rethink provisions in the Bill that might require them to scan messages for child abuse content. They argue that weakening end-to-end encryption would undermine the privacy of their users and open the door to mass surveillance.

The Bill will be further scrutinised by the House of Lords. Because the Bill has already been carried over from the previous session (and parliamentary procedure does not allow a second carry-over motion), the Government is expected to make several concessions before the Bill receives Royal Assent. The pressure is on to ensure the legislation passes in time.

Update - 20 January 2023 – The Government concedes on criminal liability for tech bosses

In response to a rebellion by almost 50 Conservative MPs, the Government has been forced to introduce criminal liability for senior managers of tech firms who persistently ignore Ofcom’s enforcement notices over failures to shield children from harmful content online.

The Bill has now moved to the House of Lords.

Update - 29 November 2022 - Introduction of a new ‘triple shield’ of protection

Michelle Donelan, Secretary of State for Digital, Culture, Media and Sport, confirmed today that the ‘legal but harmful’ element of the Online Safety Bill has been removed in relation to content accessed by adults and replaced with a ‘triple shield’, under which platforms will be required to:

  • Remove all illegal content.
  • Take down material in breach of their own terms and conditions.
  • Provide adult users with an option to hide potentially harmful material that they do not wish to see.

Businesses will also be expected to explain their age verification processes.

The Government has also announced that it intends to introduce further amendments that will criminalise 'revenge porn' (including ‘deepfake porn’ and ‘downblousing’ images), cyber-flashing and coercive behaviour.

The Bill will return to the House of Commons in its new form for Report Stage on 5 December 2022.

Update - 4 November 2022 - ‘Legal but harmful’ provisions toned down

Government drops ‘legal but harmful’ proposals: The ‘legal but harmful’ elements of the Online Safety Bill will reportedly be toned down when it is reintroduced to Parliament later this year, following a four-month halt. To balance concerns about freedom of speech against child safety, the ‘legal but harmful’ provisions will be removed only in relation to adults.
