Ofcom has published two consultations under the Online Safety Act 2023, one on transparency reporting and the other on information gathering.
Transparency reporting
Under the Act, “categorised services”, which are expected to include some of the most widely used social media and search services, are required to produce transparency reports at least annually. Ofcom is now consulting on draft industry guidance setting out its proposed approach to issuing ‘transparency notices’ to applicable online services. Subject to secondary legislation being passed, Ofcom will begin issuing these notices in 2025. The notices will set out the detailed safety information that providers must disclose in their transparency reports, the format it should take, and the deadlines for making it public. The information required will differ from platform to platform, taking account of the type of service, its number of users and the proportion who are children, along with certain other factors. It might include information such as how prevalent illegal content is on a service, how many users have encountered such content, and the effectiveness of the features a platform uses to protect children.
Ofcom is also planning to publish its own summary reports, which it hopes will improve safety outcomes for users in two ways. First, people will be able to judge whether firms are doing enough to make their platforms safe and how different services compare, helping them to make informed decisions about the apps and sites they use. Second, Ofcom hopes that shining a light on what goes on within popular sites and apps will spur firms to improve their safety standards.
Information gathering
Under the Act, Ofcom can also access information held by regulated tech firms, as well as a wide range of third parties. This aims to help Ofcom to understand the effectiveness of the safety measures tech firms have in place and to gather evidence if it has specific compliance concerns. Ofcom is now consulting on draft industry guidance on its general approach to its online safety information-gathering powers and firms’ duties to comply with them. The guidance covers the wide range of circumstances in which Ofcom might use these powers, including to:
- carry out an audit of a tech firm’s safety measures or features;
- remotely inspect the workings of their algorithms in real time;
- obtain information to allow Ofcom to respond to a Coroner’s request if a child dies; and
- in exceptional cases, enter the UK premises of tech companies to access information and examine equipment.
Firms can face enforcement action or, in the most serious cases, criminal liability if they fail to respond to information notices in an accurate, complete and timely way.
Next steps
Failure to comply with either a transparency notice or an information notice from Ofcom could result in tech companies facing fines of up to £18m or 10% of the company’s worldwide revenue, whichever is higher. Ofcom recently fined TikTok, under the separate rules governing video-sharing platforms, for failing to respond to a request for information about its parental controls.
The consultations end on 4 October 2024. Ofcom expects to publish its final guidance by early 2025.