This article was co-authored by Grace Davies, Litigation Assistant, London.
The Online Safety Bill (the Bill) continues to make its way through Parliament and, in due course, could introduce major reform of the regulation of online content and service providers. The Bill is seen as one of the most far-reaching attempts to date to regulate online content and aims to deliver the UK Government’s manifesto commitment to make the UK the safest place in the world to be online.
Following the inquest (the Inquest) into the death of Molly Russell, a 14-year-old who took her own life in 2017 after viewing harmful content online, social media companies continue to face mounting pressure to regulate their services.
In parallel, Ofcom, the appointed regulator for online safety, is preparing to enforce the Bill, including through its recent Call for Evidence, which we explore in more detail below.
The Bill following conclusion of the Inquest
Self-regulation vs the need for legislation
A number of concerns were raised in the coroner’s Regulation 28 Report to Prevent Future Deaths following the Inquest into Molly’s death. Further, the coroner noted: “although regulation would be a matter for government I can see no reason why the platforms themselves would not wish to give consideration to self-regulation taking into account the matters raised above”. Baroness Kidron OBE, speaking at the second reading of the Bill in the House of Lords, stated: “many platforms have upped their game since, but the need for this legislation has not diminished”.
In written evidence submitted to the Chairs of the Online Safety Bill Committee, Meta agreed with the intentions behind the Bill and set out its support for making it as effective and workable as possible. However, it detailed a number of concerns about the unintended consequences of the Bill’s drafting, referring to poorly defined concepts and contradictory requirements. In particular, it submitted that parts of the Bill are overly complex and ambiguous and risk undermining user privacy, and it criticised the “considerable powers” given to ministers, without judicial oversight, to direct the regulator, Ofcom.
Criminal liability
The legislation will be wide-ranging and will introduce a number of new criminal offences. As part of this, and in response to a rebellion by almost 50 Conservative MPs over an amendment targeting tech bosses who flout the rules, the UK Government has established criminal liability (with potential prison sentences of up to two years) for senior executives of tech firms who persistently ignore Ofcom’s enforcement notices about failing to shield children from harmful content online.
In essence, the amendment makes it an offence for service providers not to comply with the relevant safety duties protecting children; where the offence is committed with the “consent or connivance” of a senior manager or officer, or is “attributable to their neglect”, that individual is also guilty of the offence. This language mirrors that of other regulatory offences, and the concepts have been the subject of lengthy legal analysis.
Ofcom’s Call for Evidence – Second phase of online safety regulation: Protection of children (Call for Evidence)
Whilst the Bill progresses through Parliament on its way to becoming law, Ofcom continues its preparations for regulation.
Following its Roadmap to Regulation, published in July 2022, which set out an initial plan for implementing online safety regulation, and its initial Call for Evidence on the first phase of online safety regulation, Ofcom published its second Call for Evidence in January 2023. This has a particular focus on:
- The protection of children from legal but harmful content.
- The assessment of children’s access to online platforms where they may come across such content.
- How associated risks can be assessed, managed and mitigated.
The Call for Evidence, which closed on 21 March 2023, sought to collate evidence to assist with the preparation of codes of practice relating to the protection of children, including risk assessment guidance and guidance on how online platforms can comply with their duties. It categorised content as follows:
- 'Primary priority content', such as: pornography; content which promotes self-harm or eating disorders; and legal suicide content.
- 'Priority content', such as: online abuse, cyberbullying and harassment; harmful health content including misinformation; and content depicting or encouraging violence.
In terms of next steps, Ofcom expects to publish its second consultation in Autumn 2023 based on the information gathered from this Call for Evidence.
Comment
In addition to the launch of Ofcom’s most recent Call for Evidence, the Government is working alongside the regulator to prepare for the legislation to come into force. It has indicated that a phased approach will be taken to the duties set out in the Bill and, accordingly, to Ofcom’s powers. Although enforcement of the legislation is some way off, we can expect illegal content to be tackled first (i.e. the removal of terrorist and child sexual abuse and exploitation material), before legal but harmful content is addressed, dealing with the most harmful first.