Ofcom has published its first consultation under the Online Safety Act 2023. It sets out Ofcom's proposals for how internet services that enable the sharing of user-generated content (user-to-user services) and search services should approach their new duties relating to illegal content. It covers:
- the causes and impacts of illegal harms;
- how services should assess and mitigate the risks of illegal harms;
- how services can identify illegal content; and
- Ofcom’s approach to enforcement.
Causes and impacts of illegal harms
Ofcom is consulting on its assessment of how the priority illegal harms covered by the Act manifest online, what factors give rise to the risk of these harms and what their impact is. It says that this analysis emphasises the need for services to take action to combat online harms: it shows that a large proportion of the UK population has experienced harm online and that the impact of online harms can, in some cases, be extremely severe. It says that its assessment demonstrates that women, children and groups with protected characteristics are especially likely to be exposed to harm online.
Assessing risk
Ofcom is consulting on its proposed guidance about how regulated services should assess the risk of illegal harm taking place on their services, as well as on proposals about the governance arrangements services should put in place to manage those risks. It also asks for views on guidance about how services should keep adequate records of their risk assessments.
Ofcom proposes that services undertake a robust and comprehensive risk assessment, following a four-step process:
- understand the harms that need to be assessed;
- assess risks by considering the likelihood and potential impact of harms occurring on their service;
- implement safety measures and record outcomes of the risk assessment; and
- report, review and update the risk assessment.
It is also proposing that services take several steps to ensure that they have strong governance procedures in place to mitigate the risks associated with illegal content. For example, it is proposing that senior governance bodies at large services review the service’s risk management activities related to online safety at least annually and that all services identify a named senior executive who is accountable for compliance with the online safety duties.
Mitigating risk
It is also consulting on its illegal harms Codes of Practice, which set out recommended measures that regulated services can take to mitigate the risk of illegal harm. The Codes are not binding, so services can choose to take a different approach to meeting their duties, but they act as a ‘safe harbour’: any service that implements the recommendations in the Codes will be deemed compliant with the related safety duties. Ofcom recommends that services put in place a series of measures which, taken together, will help combat the priority illegal harms in scope of the Act. These measures include:
- ensuring content moderation teams are appropriately resourced and trained;
- having easy-to-use systems for users to report potentially illegal content and make complaints;
- allowing users to block other users or disable comments;
- conducting tests when they update the algorithms that recommend content to users (‘recommender systems’) to assess whether the changes would increase the dissemination of illegal content (a test of this kind is sketched after this list); and
- taking a series of recommended steps to make their terms and conditions clear and accessible.
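Ofcom does not prescribe how such pre-release recommender tests should be built. Purely as an illustration, the following Python sketch compares the rate at which a current and a candidate recommender surface content flagged by a service’s own moderation signals; every name here (`recommend`, `is_flagged`, the sampling approach) is a hypothetical assumption, not anything specified in the consultation.

```python
from typing import Callable, Iterable, List

def illegal_content_rate(
    recommend: Callable[[str], List[str]],
    users: Iterable[str],
    is_flagged: Callable[[str], bool],
    k: int = 10,
) -> float:
    """Fraction of top-k recommendations, across a sample of users, that the
    service's own moderation signals flag as potentially illegal."""
    shown = [item for user in users for item in recommend(user)[:k]]
    return sum(is_flagged(item) for item in shown) / len(shown) if shown else 0.0

def safe_to_roll_out(current, candidate, users, is_flagged, tolerance: float = 0.0) -> bool:
    """Pre-release check: block the rollout if the candidate recommender surfaces
    flagged content at a materially higher rate than the current one."""
    users = list(users)  # reuse the same user sample for both variants
    return (illegal_content_rate(candidate, users, is_flagged)
            <= illegal_content_rate(current, users, is_flagged) + tolerance)
```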
Ofcom also proposes that relevant services should take a series of targeted steps to combat Child Sexual Exploitation and Abuse (CSEA), fraud and terrorism. These targeted steps include:
- Using a technology called “hash matching” to detect and remove known CSAM (Child Sexual Abuse Material); a simplified sketch of how hash matching works follows this list. This measure does not apply to private, end-to-end encrypted communications, and Ofcom is not proposing measures which would involve breaking encryption. However, end-to-end encrypted services are still subject to all the safety duties set out in the Act and will still need to take steps to mitigate the risk of CSAM on their services.
- Taking steps to make it harder for perpetrators to groom children online. For example, configuring default settings so that children do not appear in other users’ lists of suggested connections.
- Deploying keyword detection systems to help find and remove posts linked to the sale of stolen credentials (a simplified keyword filter is sketched after this list). Ofcom also recommends that large and high-risk services have dedicated fraud reporting channels.
- Being transparent, where they operate account verification schemes, about the steps taken to verify accounts. This is aimed at reducing the risk of users being deceived by posts from fake accounts and should help address fraud and foreign interference in UK democratic processes such as elections.
- Blocking accounts run by banned terrorist organisations.
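To illustrate the hash-matching measure mentioned above: deployed systems use perceptual hashes (such as PhotoDNA or PDQ) that survive re-encoding, resizing and small edits, and match uploads against hash lists supplied by authorised bodies such as the Internet Watch Foundation. The sketch below substitutes a cryptographic hash from Python’s standard library, which only catches byte-for-byte identical copies, to show the match-against-a-list principle; the function names and workflow are illustrative assumptions, not Ofcom’s specification.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Exact-match fingerprint of an upload's bytes. Real deployments use
    perceptual hashes so that near-duplicate copies still match."""
    return hashlib.sha256(data).hexdigest()

def is_known_csam(upload: bytes, known_hashes: set) -> bool:
    """Check an upload against a hash list of known material supplied by an
    authorised body; a match would trigger blocking and reporting."""
    return sha256_hex(upload) in known_hashes
```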
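Similarly, the keyword-detection measure for stolen credentials might, at its simplest, look like the hypothetical filter below. The patterns shown are illustrative guesses; a production system would rely on curated, regularly updated term lists and additional classifiers to keep false positives manageable.

```python
import re

# Illustrative patterns only (assumptions for this sketch, not Ofcom's list).
SUSPECT_PATTERNS = [
    re.compile(r"\bfullz\b", re.IGNORECASE),            # slang for stolen identity records
    re.compile(r"\bcvv\b.{0,30}(for\s+sale|shop)", re.IGNORECASE),
    re.compile(r"\bbank\s+logs?\b", re.IGNORECASE),
]

def flag_for_review(post_text: str) -> bool:
    """Route a post to human review if it matches any pattern associated
    with the sale of stolen credentials."""
    return any(p.search(post_text) for p in SUSPECT_PATTERNS)
```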
Some of these proposals would apply to all services; however, Ofcom says it is targeting the more onerous measures only at services which are large and/or high risk.
Ofcom’s proposals also aim to combat violence against women and girls online, including by setting out how services should assess their risk of coercive and controlling behaviour, stalking, harassment and threats, and intimate image abuse. Ofcom recognises that more work is needed in this area, and it will be publishing draft guidance on how services can combat violence against women and girls in early 2025 as part of a later phase of work related to the Act.
Identifying illegal content
The Act introduces a new legal requirement for all services to take down specific illegal content swiftly once they become aware of it. Ofcom is therefore consulting on its Illegal Content Judgements Guidance, which will help services identify whether a piece of content is likely to be illegal.
Enforcement
The Act gives Ofcom significant enforcement powers where organisations do not comply with it. These include the power to issue fines of up to £18m or up to 10% of the service’s qualifying worldwide revenue (whichever is greater), and to apply for a court order requiring an internet service provider to withdraw access to a service where its failure to comply poses a significant risk of harm to UK users. Ofcom’s guidance on enforcement sets out its approach.
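To make the ‘whichever is greater’ formula concrete, here is a minimal sketch (the function name is an assumption; the £18m floor and 10% rate are as stated above):

```python
def maximum_fine_gbp(qualifying_worldwide_revenue_gbp: float) -> float:
    """Maximum penalty under the Act: the greater of £18m and 10% of
    qualifying worldwide revenue."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)

# A service with £100m qualifying revenue faces a cap of £18m (the floor
# applies); one with £500m faces a cap of £50m (10% exceeds the floor).
assert maximum_fine_gbp(100_000_000) == 18_000_000
assert maximum_fine_gbp(500_000_000) == 50_000_000
```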
Next steps