Briefing note

Online Safety Bill

Update - 29 November 2022

Michelle Donelan, Secretary of State for Digital, Culture, Media and Sport, confirmed today that the ‘legal but harmful’ element of the Online Safety Bill has been removed in relation to content accessed by adults and replaced with the ‘triple shield’, under which platforms will be required to:

  • Remove all illegal content.
  • Take down material in breach of their own terms and conditions.
  • Provide adult users with an option to hide potentially harmful materials that they do not wish to see.

Businesses will also be expected to explain their age verification processes.

The Government has also announced that it intends to introduce further amendments that will criminalise revenge porn (including ‘deepfake’ porn and ‘downblousing’ images), cyberflashing and coercive behaviour.

The Bill will return to the House of Commons in its new form for Report Stage on 5 December 2022.

Previous updates

Update - 4 November 2022

Government drops ‘legal but harmful’ proposals: The ‘legal but harmful’ elements of the Online Safety Bill will reportedly be toned down when it is reintroduced to Parliament later this year, following a four-month halt. To balance concerns about impinging on freedom of speech against the need to protect children, the ‘legal but harmful’ provisions will be removed only in relation to adults.

The UK Online Safety Bill is seen as one of the most far-reaching attempts to date to regulate online content. The Bill aims to deliver the UK Government’s manifesto commitment to make the UK the safest place in the world to be online while also defending freedom of speech.

Failure to comply will result in substantial fines (of up to £18 million or 10% of a company's annual global revenue) imposed by Ofcom as regulator or, in extreme circumstances, the prosecution of company directors and senior managers.

Who’s affected?

The duties imposed by the Bill apply to any ‘user-to-user service’ or ‘search service’. Regulated services will therefore include social media networks, search engines and sites such as forums, messaging apps and some online games. If enacted, the legislation will apply to any company which has users in the UK, even if the company itself is not based in the UK.

Background and timeline

The Bill builds on the UK Government’s earlier proposals to establish a duty of care for online providers with an independent regulator to oversee and enforce compliance with the duty, as laid out in its April 2019 White Paper and its December 2020 response to the White Paper consultation.

The Bill was introduced to Parliament on 17 March 2022 following the publication of the draft Bill in May 2021. It will work its way through the House of Commons before being introduced to the House of Lords. Only once it has undergone the scrutiny of both Houses will the Bill receive Royal Assent and become law.

Key aims of the Bill

Under the new legislation, in-scope platforms will need to:

  1. Tackle 'priority' illegal content and activity by removing illegal content, such as terrorist or child sexual abuse and exploitation material.
  2. Protect children from harmful or inappropriate content, such as bullying, pornography and the promotion of self-harm.
  3. Tackle legal but harmful content, such as self-harm or eating disorder content. Social media companies will need to make clear in their terms and conditions what is and is not acceptable on their site, and enforce this.
  4. Safeguard freedom of expression and pluralism online.

Measures and powers under the Bill

The legislation will be wide-ranging with new criminal offences and measures including:

  • Criminal sanctions for tech bosses.
  • Criminal offences for failing to cooperate with Ofcom, including falsifying or destroying data, with offenders potentially facing up to two years in prison or a fine.
  • Requiring social media platforms to tackle legal but harmful content, with Parliament approving which types of content must be addressed.
  • Ensuring companies tackle illegal or criminal activity online more quickly.
  • Combatting fraud by addressing paid-for-scam adverts on social media and search engines.
  • The criminalisation of cyberflashing.
  • A duty on platforms to report any child sexual exploitation and abuse content they encounter to the National Crime Agency.
  • Ensuring 18+ age verification checks for sites that host pornography.
  • New measures to address anonymous trolls online, while giving users more control over whom they interact with and what content they are exposed to.

Reaching Royal Assent

Owing to the high levels of interest across all parties and both chambers, the government’s draft Bill was subjected to pre-legislative scrutiny by a joint committee of MPs and peers, an unusual procedural step. This resulted in 127 recommendations, 66 of which have been adopted in the new Bill.

However, as the Bill is so far-reaching and in light of the controversy it has attracted to date, it is expected to receive ongoing scrutiny, undergo further changes and ultimately face a lengthy passage through Parliament.
