UK Increases Child Safety Online
In September 2023, the UK Online Safety Bill, which seeks to increase online safety and security, particularly for children using online platforms, passed its final parliamentary debate. On October 26, 2023, the bill received royal assent and the Online Safety Act became law.
The law imposes legal responsibilities on providers of services that enable direct information sharing between users (user-to-user services) and on search engines. This includes social networks, messaging services, video- and picture-sharing platforms, and other similar services.
The law is likely to apply to user-to-user service providers and to search engines with connections to the UK, as determined by any of the following criteria:
- A significant volume of UK users.
- The UK is one of their target markets.
- The content they host poses a material risk of significant harm to individuals in the UK.
Services and platforms offering various social network aspects (such as allowing user comments, messaging, connecting with “friends,” etc.) may fall within the scope of the law, even if such aspects are secondary to the other services provided.
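By way of illustration only, and not as legal advice, the applicability test above can be thought of as a disjunction: a service is potentially in scope if any one of the three criteria is met. The following minimal Python sketch (the field names are our own shorthand, not statutory terms) captures that structure:

```python
# Illustrative self-assessment sketch (not legal advice): a service is
# potentially in scope of the Online Safety Act if ANY criterion is met.
from dataclasses import dataclass

@dataclass
class ScopeAssessment:
    significant_uk_user_base: bool   # significant volume of UK users
    uk_is_target_market: bool        # the UK is one of the target markets
    material_risk_to_uk_users: bool  # content poses a material risk of
                                     # significant harm to individuals in the UK

    def potentially_in_scope(self) -> bool:
        # The criteria are alternatives, not cumulative requirements.
        return (self.significant_uk_user_base
                or self.uk_is_target_market
                or self.material_risk_to_uk_users)

# Example: a service targeting the UK market is potentially in scope
# even without a significant existing UK user base.
print(ScopeAssessment(False, True, False).potentially_in_scope())  # True
```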
Legal Responsibility
Pursuant to the provisions of the Online Safety Act, regulated companies and services have a legal responsibility for the following:
- To perform content risk assessments, including identification of content that could fall under one of the definitions specified in the law.
- To remove illegal content, including content promoting or exhibiting self-harm, child sexual abuse, controlling or coercive behavior, extreme sexual violence, sexual exploitation, revenge porn, fraud, sales of illegal drugs or weapons, suicide, incitement to violence, illegal immigration and people smuggling, and terrorism.
- To offer users of all ages the option of filtering harmful content such as bullying.
The largest service providers and those posing the highest risk of disseminating dangerous content will be obligated to take additional measures:
- To enforce the content safety promises made to users in their terms of use, by removing content that violates those terms, even if such content is not otherwise illegal or banned.
- To provide adult users with tools to help reduce the likelihood they will encounter particular types of content (such as content promoting eating disorders, self-harm, racism, antisemitism, misogyny, etc.).
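As a rough illustration of what such user tools might look like in practice (the category labels and data model below are hypothetical, not drawn from the Act or from any regulator's guidance), a platform might let adult users opt out of specified content categories:

```python
# Minimal sketch of a user-empowerment content filter, assuming each
# post has already been labelled with content categories upstream.
# Category names here are hypothetical examples.
from typing import Iterable

FILTERABLE_CATEGORIES = {"eating_disorders", "self_harm", "racism",
                         "antisemitism", "misogyny"}

def visible_posts(posts: Iterable[dict], user_filters: set[str]) -> list[dict]:
    """Hide posts whose labels intersect the categories the user opted out of."""
    active = user_filters & FILTERABLE_CATEGORIES
    return [p for p in posts if not (set(p.get("labels", ())) & active)]

posts = [
    {"id": 1, "labels": ["sports"]},
    {"id": 2, "labels": ["self_harm"]},
]
# An adult user who opted out of self-harm content sees only post 1.
print(visible_posts(posts, {"self_harm"}))  # [{'id': 1, 'labels': ['sports']}]
```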
Child Safety
The law obligates information service providers to increase their child safety measures, as follows:
- To prevent children from accessing harmful and age-inappropriate content. Such content is not illegal per se, but is not suitable for children. It includes, inter alia, pornographic content; content that promotes, encourages, or provides instructions for suicide, self-harm, or eating disorders (even if the content does not reach the threshold of a criminal offense); content depicting or encouraging extreme violence; and bullying content.
- To increase their transparency about the risks and dangers posed to children using online services, including by publishing risk assessments.
- To provide parents and children with clear and accessible ways to report various problems that arise when using online services.
Regulated services that fail to comply with the provisions of the Online Safety Act could face fines of up to GBP 18 million or 10% of their global annual revenue, whichever is higher. Furthermore, law enforcement may hold companies and senior managers criminally liable if they fail to comply with enforcement orders requiring them to implement measures to protect children and to prevent sexual abuse or exploitation.
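To make the penalty ceiling concrete, here is a minimal arithmetic sketch of the “whichever is higher” rule (illustrative only; the actual fine in any given case is determined by the regulator):

```python
# Maximum potential fine under the Act: the greater of GBP 18 million
# or 10% of global annual revenue (illustrative sketch only).
def max_fine_gbp(global_annual_revenue_gbp: float) -> float:
    return max(18_000_000, 0.10 * global_annual_revenue_gbp)

# For a company with GBP 500 million in global annual revenue, the
# ceiling is GBP 50 million, since 10% exceeds GBP 18 million.
print(f"{max_fine_gbp(500_000_000):,.0f}")  # 50,000,000
```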
Pan-European Initiatives
The UK Online Safety Act joins several other initiatives regulating child safety when using online services in general, and social media and search engines in particular. For example, the European Digital Services Act (DSA), which comes into full effect in February 2024, requires online platforms to implement transparency and content moderation measures. The UK Children’s Code of 2020 also applies the principles of the UK GDPR to information services used by children and obligates companies to take action to protect children’s rights.
The bill aroused considerable debate, since it could require companies to weaken or forgo end-to-end encryption on some services in order to enable the content filtering and management the law requires. Additionally, many service providers would need to implement age-checking mechanisms that, in turn, could result in the collection of considerable personal information about children.
Since most of the law’s provisions will only come into effect after a transitional period, and since some of the implementing regulations have not yet been finalized or published, the full implications for businesses are not yet completely clear. According to assessments by the UK Office of Communications (Ofcom), which is responsible for enforcing the law, partial enforcement, pending parliamentary approval of the regulations, will begin during Q4 2023.
Practical Measures
We recommend that businesses take various measures to prepare for the entry into force of the UK Online Safety Act.
1. Analyze your company’s online activities to ascertain if the law applies to you:
- Does your company enable any kind of direct user-to-user communications?
- Does your company provide search services or applications?
- Does your company target the UK market or have a significant volume of UK users?
2. Make sure your company’s terms of use and privacy policy are clear. Consider drafting simple and straightforward versions of these documents that children can easily understand.
3. Perform a data protection impact assessment to identify your platform’s information-processing activities that may affect children and their privacy.
4. Consider whether your platform needs to implement an age-checking mechanism (see the illustrative sketch following this list).
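The following is a minimal, hypothetical sketch of the kind of age gate such a mechanism might involve. The date-of-birth approach shown is only one of several verification options and, as noted above, collecting children’s data raises its own privacy considerations:

```python
# Minimal age-gate sketch (hypothetical): derive age from a declared date
# of birth and gate access, discarding the date of birth immediately to
# minimise the personal data retained about children.
from datetime import date

def is_of_age(date_of_birth: date, minimum_age: int = 18,
              today: date | None = None) -> bool:
    today = today or date.today()
    # Subtract one year if the birthday has not yet occurred this year.
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day))
    return age >= minimum_age

# Example: a user born on 2010-05-01 is not of age on 2024-01-15.
print(is_of_age(date(2010, 5, 1), today=date(2024, 1, 15)))  # False
```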
***
Barnea Jaffa Lande’s Privacy, Data Protection and Cyber Department is at your service to answer any questions about implementing the provisions of the UK Online Safety Act, or about data security and privacy protection, the drafting of compliance programs, and more.
Dr. Avishay Klein is a partner and heads the department.
Adv. Masha Yudashkin is an associate in the department.