Ofcom has published its draft Children’s Safety Codes of Practice, which set out how it expects online services to meet their legal responsibilities to protect children online. This is the second key consultation under the Online Safety Act and includes Ofcom’s proposals for how internet services that enable the sharing of user-generated content (user-to-user services) and search services should approach their new duties relating to content that is harmful to children. There are two separate codes: one for user-to-user services and one for search services. Together, they include around 40 measures that companies can take to protect children.
The consultation covers:
- how to assess if a service is likely to be accessed by children;
- the causes and impacts of harms to children; and
- how services should assess and mitigate the risks of harms to children.
Services must carry out robust age-checks to stop children accessing harmful content
The draft Codes expect much greater use of highly effective age assurance, so that services know which of their users are children and can keep them safe.
Ofcom says that in practice, this means that all services which do not ban harmful content, and those at higher risk of it being shared on their service, will be expected to implement highly effective age checks to prevent children from seeing it. This may mean preventing children from accessing the entire website or app, age-restricting parts of a site or app for adults-only access, or restricting children’s access to identified harmful content.
Methods of age assurance which would be considered highly effective include open banking, photo-ID matching, facial age estimation, credit card checks, reusable digital ID services and mobile network operator age checks. Ofcom excludes methods such as self-declaration of age, age verification via payment methods which do not require the user to be over 18 (such as debit cards), and general contractual restrictions.
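Purely as an illustration, a service could encode this distinction in its own configuration along the following lines. The method names and the validate_age_assurance_method helper are hypothetical; the draft Codes describe acceptable outcomes, not an implementation.

```python
# Illustrative sketch only: the method names and this helper are
# hypothetical, not drawn from any Ofcom specification.

# Methods the draft Codes indicate can be highly effective
HIGHLY_EFFECTIVE_METHODS = {
    "open_banking",
    "photo_id_matching",
    "facial_age_estimation",
    "credit_card_check",
    "reusable_digital_id",
    "mobile_network_operator_check",
}

# Methods the draft Codes exclude as not highly effective
EXCLUDED_METHODS = {
    "self_declaration",
    "debit_card_payment",       # payment methods not restricted to over-18s
    "contractual_restriction",  # e.g. terms of service stating "18+ only"
}

def validate_age_assurance_method(method: str) -> bool:
    """Return True only if the configured method is on the highly effective list."""
    if method in EXCLUDED_METHODS:
        raise ValueError(f"{method!r} is excluded under the draft Codes")
    return method in HIGHLY_EFFECTIVE_METHODS
```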
Ensure that algorithms which recommend content do not operate in a way that harms children
Ofcom says that recommender systems (algorithms which provide personalised recommendations to users) are children’s main pathway to harm online. They may serve up unsolicited, dangerous content to children in their personalised news feeds. As a consequence, Ofcom is proposing that any service which operates a recommender system and is at higher risk of harmful content must also use highly effective age assurance to identify which of its users are children. Such services must then configure their algorithms to filter out the most harmful content from those children’s feeds and reduce the visibility and prominence of other harmful content. Children must also be able to provide negative feedback directly to the recommender feed, so it can better learn what content they don’t want to see.
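As a rough sketch of what this could look like in a recommender pipeline, the snippet below post-filters a ranked feed for a user identified as a child: items flagged as the most harmful are removed outright, other harmful items are down-ranked, and negative feedback is recorded so the recommender can learn from it. All names (Item, harm_level, DOWNRANK_FACTOR and so on) are invented for the sketch; the draft Codes describe outcomes, not implementations.

```python
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    score: float     # relevance score from the upstream ranker
    harm_level: int  # hypothetical label: 0 = none, 1 = harmful, 2 = most harmful

DOWNRANK_FACTOR = 0.1  # assumed penalty reducing visibility of harmful content

def filter_feed_for_child(ranked_items: list[Item]) -> list[Item]:
    """Post-filter a ranked feed for a user identified as a child
    by highly effective age assurance."""
    safe_feed = []
    for item in ranked_items:
        if item.harm_level >= 2:
            continue  # exclude the most harmful content entirely
        if item.harm_level == 1:
            item.score *= DOWNRANK_FACTOR  # reduce visibility and prominence
        safe_feed.append(item)
    return sorted(safe_feed, key=lambda i: i.score, reverse=True)

def record_negative_feedback(user_profile: dict, item: Item) -> None:
    """Capture a child's "I don't want to see this" signal so the
    recommender can learn what content to avoid."""
    user_profile.setdefault("negative_feedback", []).append(item.item_id)
```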
Introduce better moderation of content harmful to children
Evidence shows that content harmful to children is available on many services at scale, which suggests that services’ current efforts to moderate harmful content are insufficient. Under the draft Codes, all user-to-user services must have content moderation systems and processes that ensure swift action is taken against content harmful to children. Search engines are expected to take similar action; and where a user is believed to be a child, large search services must implement a “safe search” setting which cannot be turned off and must filter out the most harmful content.
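A minimal sketch of the “cannot be turned off” requirement, assuming a hypothetical per-user settings object: once a user is believed to be a child, the safe-search flag is forced on and any attempt to disable it is rejected.

```python
class SearchSettings:
    """Hypothetical per-user search settings; not an Ofcom-specified API."""

    def __init__(self, believed_to_be_child: bool):
        self.believed_to_be_child = believed_to_be_child
        self.safe_search = True  # defaults on for everyone in this sketch

    def set_safe_search(self, enabled: bool) -> None:
        if self.believed_to_be_child and not enabled:
            # The draft Codes expect the setting to be locked on for children
            raise PermissionError("Safe search cannot be disabled for child users")
        self.safe_search = enabled

def apply_safe_search(results: list[dict], settings: SearchSettings) -> list[dict]:
    """Filter out results flagged as most harmful when safe search is on."""
    if not settings.safe_search:
        return results
    return [r for r in results if not r.get("most_harmful", False)]
```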
What else is Ofcom proposing?
Other broader measures would require services to have clear policies on what kind of content is allowed, how content is prioritised for review, and how content moderation teams are resourced and trained. Ofcom plans to consult later this year on how automated tools, including AI, can be used to proactively detect illegal content and content most harmful to children, including previously undetected child sexual abuse material and content encouraging suicide and self-harm.
Stronger senior accountability and support for children and parents
The draft Codes also aim to ensure strong governance and accountability for children’s safety within tech firms. These include having a named person accountable for compliance with the children’s safety duties; an annual senior-body review of all risk management activities relating to children’s safety; and an employee Code of Conduct that sets standards for employees around protecting children.
A range of other proposed safety measures focus on providing more choice and support for children and the adults who care for them. These include having clear and accessible terms of service, and making sure that children can easily report content and make complaints.
Support tools should also be provided to give children more control over their interactions online – such as an option to decline group invites, block and mute user accounts, or disable comments on their own posts.
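Purely as an illustration, these user-facing controls might surface as per-account settings along the following lines. The field names are invented for this sketch and are not taken from the draft Codes.

```python
from dataclasses import dataclass

@dataclass
class ChildSupportTools:
    """Hypothetical per-account controls for a child user."""
    decline_group_invites: bool = True   # invites held for explicit accept/decline
    can_block_accounts: bool = True      # block other user accounts
    can_mute_accounts: bool = True       # mute other user accounts
    comments_on_own_posts: bool = False  # comments on own posts disabled by default

# Example: safer defaults applied when an account belongs to a child
settings = ChildSupportTools()
if settings.decline_group_invites:
    print("Group invites will be held for the user to accept or decline.")
```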
Risk assessment guidance
Ofcom has also published draft Children’s Access Assessment Guidance and Children’s Risk Assessment Guidance to help services comply with the Online Safety Act.
Ofcom points out that companies may already have assessed whether a service is likely to be accessed by children as set out in the ICO’s Children’s code to comply with data protection laws. It highlights that the requirements of data protection law are different, and companies will need to carry out a separate children’s access assessment, although they may be able to draw on similar evidence and analysis for both. Last week, the ICO and Ofcom issued a joint statement about their collaboration on the regulation of online services where online safety and data protection intersect.
The purpose of the children’s risk assessment is to improve a company’s understanding of the risk of harm to children on its service and of the safety measures it needs to put in place to protect them. If a service, or part of it, is likely to be accessed by children, a children’s risk assessment must be completed to comply with the Online Safety Act.
What happens next?
The consultation ends on 17 July 2024. Ofcom expects to publish its final Children’s Safety Codes of Practice within a year. Once approved by Parliament, the Codes will come into effect and Ofcom will begin enforcing the regime.