The protection of children by the Online Safety Bill: is it too late for some?


This Bill is currently progressing through Parliament. For updates, please read our briefing note.

Former UK Prime Minister Liz Truss recently confirmed that the UK Online Safety Bill (the Bill), one of the most far-reaching attempts to date to regulate online content, would require “some tweaks” but would be taken forward in Parliament to ensure that under-18s are protected from harm online. The current PM, Rishi Sunak, who has taken over progression of the Bill, has vowed to make substantial progress before Christmas.

As stakeholders around the world kept watch over the progress through Parliament of the draft legislation, the aim of which is to make the UK the safest place in the world to be online, so too did many follow the inquest of Molly Russell (the Inquest), a 14-year-old who took her own life in 2017 after viewing harmful content online. As the coroner recently concluded that such content is “likely” to have contributed to Molly’s death, we consider the impact of the Inquest on the forthcoming legislation as the government faces increasing pressure to strengthen the Bill’s measures aimed at protecting children.

The Bill

The Bill builds on the UK Government’s earlier proposals to establish a duty of care on regulated online service providers. Once enacted, the Bill will empower Ofcom, the UK’s communications regulator, to levy substantial fines of up to £18 million or 10% of a company’s annual global revenue, and will give it powers to prosecute company directors and senior managers in relation to communications offences committed online.

As currently drafted (but noting that the Bill is expected to return to the House of Commons before Christmas), regulated service providers will need to proactively regulate content on their platforms and protect users from exposure to both illegal material and legal but potentially harmful material, by taking proportionate measures to mitigate and manage the risks of harm.

Specifically in relation to children, in-scope platforms will need to protect them from harmful or inappropriate content, such as bullying and pornography, and tackle legal but harmful content, such as material promoting self-harm or eating disorders. They will be required to:

  • Mitigate and manage the risks and impact of harm to children in different age groups.
  • Prevent children from encountering ‘primary priority content’ and any non-designated content, for example, by using age verification processes such as asking for credit card details.
  • Set out in terms and conditions how children are to be prevented or protected from encountering harmful content.

The Bill continues to face mounting pressure, including from Ian Russell, Molly’s father and Chief Executive of the Molly Rose Foundation, who joined a host of expert witnesses in giving his views to the Joint Committee on the Bill in Parliament in September, and who has more recently called on the government to “urgently deliver its long-promised legislation”.

The Inquest

Senior Coroner Andrew Walker said that the Inquest should serve as a catalyst for protecting children from the risks that the internet has brought into family homes. Having watched graphic videos and content “of the most distressing nature” during the Inquest, the Coroner said the footage “appears to glamorise harm to young people” and called the Inquest an opportunity to “make this part of the internet safe”. Listed among his concerns were the lack of content regulation, the lack of age verification and the use of algorithms.

At the Inquest, during evidence given by a senior executive at Meta, it was accepted that Instagram had shown Molly posts that breached its content guidelines. Similarly, a senior executive at Pinterest accepted that the platform was previously “not safe” and offered an apology. Whilst the purpose of an inquest is not to try a criminal offence or to determine civil liability, the relationship between these admissions and the criminal offences within the scope of the Bill has been called into question: once the Bill is implemented, companies and/or senior managers, if prosecuted, will face entering ‘guilty’ or ‘not guilty’ pleas to criminal charges.

Separately, the Inquest heard evidence that, in the right circumstances, certain self-harm content can be shared in a positive context to “facilitate the coming together to support” other users. Legal representatives for the Russell family questioned whether an average 13-year-old child could differentiate between material raising awareness of self-harm and material encouraging or promoting such behaviour. These difficult questions posed at the Inquest sit alongside the wider ongoing debate. Although Ofcom has announced that it will provide guidance and codes of practice on how regulated service providers can comply with their duties, where the line will be drawn will remain unclear until the legislation is implemented, fleshed out by secondary legislation, and enforced in practice by the regulator.

Regulation 28 Report to prevent further deaths

Following the conclusion of the Inquest, the Senior Coroner issued a report on 13 October 2022 making a number of recommendations, including that the UK Government give consideration to: “reviewing the provision of internet platforms to children, with reference to harmful on-line content, separate platforms for adults and children, verification of age before joining the platform, provision of age specific content, the use of algorithms to provide content, the use of advertising and parental guardian or carer control including access to material viewed by a child and retention of material viewed by a child”.

Whilst acknowledging that the overarching legislation is a matter for government, the Senior Coroner states that he “can see no reason” why the online platforms themselves would not wish to give consideration to self-regulation.

The relevant parties have a duty to respond to the report by 8 December 2022.


The UK Government faces increasing pressure to drive the Online Safety Bill forward and strengthen its measures to protect children online, and the media interest in the Inquest has no doubt added to this. Stakeholders should expect amendments to the Bill, including the removal of ‘legal but harmful’ adult content from the regulatory regime. It is unlikely that rules relating to children’s safety will be compromised; in fact, further amendments may be introduced, such as measures to help bereaved parents access information from social media companies.

We also expect that the Report to Prevent Future Deaths will prompt a number of social media companies to strengthen their policies in relation to the safety of children online, in an attempt to pre-empt the legislative framework that is to come.

Stakeholders should also bear in mind that although the companies on this occasion were questioned in an inquest setting, once the Bill is in force, such scrutiny will take place against a backdrop of criminal offences, and will be a matter for the regulator and the criminal courts.

Related items:

Read other items in Crime and Regulatory Brief - December 2022
