Ofcom is seeking evidence to inform the research it will need to carry out to prepare its advice to the UK government on categorising regulated services under the Online Safety Bill.
Once passed, the Online Safety Bill will require certain online services such as social media sites, messaging apps and search engines to identify risks to people and have measures in place for protecting them from certain types of harm online. Ofcom will set out guidance and codes of practice on how companies can comply with their duties.
Websites and apps that are in scope will have to protect all their users in the UK from illegal content and, where applicable, protect children from certain online harms. Some services will be categorised as Category 1, 2A or 2B if they meet certain thresholds set out in secondary legislation by the government. These categorised services will be required to comply with additional requirements, including producing transparency reports.
Once the new laws are enacted, Ofcom will be required to carry out research to help advise the government on the thresholds it sets in secondary legislation. Ofcom will then produce a list of categorised services based on these thresholds.
Category 1 and 2B thresholds will be set by reference to user numbers and functionalities; Category 2A thresholds will likewise take user numbers into account.
The call for evidence gives any interested parties the chance to provide information and evidence that Ofcom will consider when carrying out its research. It plans to publish another call for evidence later this year on the duties that will apply to categorised services.
In particular, it seeks information about how companies measure user numbers on the relevant user-to-user parts of their services. Each service is different, and what counts as a user may vary from one service to another. Ofcom wants to ensure that its approach to categorisation is attuned to this.
The call for evidence ends on 12 September 2023.
In addition, the Irish regulator Coimisiún na Meán is seeking views from the public and other interested parties to inform the development of Ireland’s first binding Online Safety Code. The first Online Safety Code is intended to focus on video-sharing platform service providers and to ensure they take measures to address online harms more effectively.
The Commission, which was established in March 2023, has a key responsibility for setting standards, rules and codes for the different types of media services and relevant online services operating in Ireland. This includes responsibility for preparing and applying an Online Safety Code.

The Call for Inputs seeks to gather a wide range of views to help develop a code that is fit for purpose and will inform the preparation of the draft code for formal consultation later this year. In preparation for the development of the Online Safety Code, the Commission is also consulting on a new e-Commerce Compliance Strategy. Separately, it intends to establish a Youth Advisory Committee and is conducting research on online harms. Over the coming months, the Commission will designate the video-sharing platforms that will fall within the scope of its regulatory framework for online safety.

The consultation ends on 16 August 2023.