Online Safety Bill: a balance to be struck
On 17 March 2022, the UK Government introduced to Parliament a 'world-leading' Online Safety Bill aimed at making the UK the safest place to go online while defending free expression.
The Bill was originally published in draft form in May 2021, but later underwent 'significant improvements' to tighten the content, provide greater clarity and cover more offences. The Bill will no doubt face further scrutiny as it continues its legislative journey under a new prime minister.
Both Liz Truss and Rishi Sunak have committed to revisiting the 'legal but harmful' element of the Bill (the categories of which will require approval by Parliament), to ensure the legislation does not undermine freedom of speech. Moving forwards, stakeholders should expect the Bill to be amended when it is picked up again following Parliament's summer recess.
Here, and in light of Ofcom’s Call for Evidence about its plans for implementation of online safety regulation, we focus on the scope of the Bill’s provisions and the likely impact on organisations. A careful balance must be struck between providing adequate protection whilst also promoting innovation and investment in the UK’s digital economy.
The new duty of care
The Bill aims to provide UK internet users with a safer digital experience by imposing a new statutory duty of care on regulated service providers. Websites, apps and other software will be caught under the Bill.
As currently drafted, these service providers will need to proactively regulate content on their platforms and protect users from exposure to both illegal material and legal but potentially harmful material, by taking proportionate measures to mitigate and manage the risks of harm. Proportionality will be assessed by reference to the findings of the regulated service provider's risk assessments and the size and capacity of the service.
Online providers accessed by children will have a duty to “mitigate the impact of harm to children in different age groups presented by content that is harmful to children”.
In-scope organisations will be required to:
- Mitigate and manage the risks and impact of harm to children in different age groups.
- Prevent children from encountering "primary priority content" and any non-designated content, for example by using age verification processes such as asking for credit card details.
- Set out in terms and conditions how children are to be prevented or protected from encountering harmful content.
Interestingly, the duty mirrors the approach taken in the Children's Code, created by the Information Commissioner’s Office, which introduced 15 standards to be observed by organisations involved in the processing of children's data.
Regulated service providers will have a duty to protect users from illegal material on their sites, conduct a content risk assessment, and put in place proportionate measures to mitigate and manage the risks of harm from this content. Systems will need to be in place allowing users and affected persons to report illegal content.
Legal but harmful content
The sites with the highest number of users will need to conduct 'adult' risk assessments and make clear in their terms and conditions what legal but harmful content is acceptable on their platforms. The categories of content that companies' terms and conditions will need to address will be set out in secondary legislation and approved by Parliament.
Implications for regulated service providers
The government has estimated the total number of in-scope services in the UK at around 25,000, and once enacted, the legislation will have international reach. This could result in international service providers deliberately blocking UK users, as some companies did when the General Data Protection Regulation (GDPR) came into effect.
Costs associated with compliance
There will no doubt be regulatory costs associated with compliance for many businesses. Combined with the risk of large regulatory penalties and criminal liability for failing to comply with the legislation, the upshot may be a more cautious approach to innovation and a decrease in investment in the UK.
The Bill is expected to pass in early 2023 with Ofcom’s regulatory powers coming into force two months later. Ofcom has announced that it will provide guidance and codes of practice on how regulated service providers can comply with their duties. Until then, there remains uncertainty for those currently working on their compliance efforts.
Much of the Bill's application will be fleshed out in future secondary legislation, which, until published, creates additional uncertainty for regulated service providers.
Ultimately, the legislation must strike a balance between making online platforms safer, not least to safeguard children, and promoting business innovation and competition within the UK digital economy. Where the line will be drawn remains unclear. However, with Ofcom's Call for Evidence open until 13 September 2022, stakeholders currently have a platform to engage with the regulator.