Ofcom consults on draft guidance: A safer life online for women and girls

February 26, 2025

Ofcom is consulting on draft guidance to protect women and girls under the Online Safety Act. The guidance sets out areas where technology firms can contribute towards improving women and girls’ online safety by taking responsibility, designing their services to prevent harm and supporting their users.

As SCL readers will be aware, the Online Safety Act 2023 makes platforms, including social media, gaming services, dating apps, discussion forums and search services, legally responsible for protecting people in the UK from illegal content and content harmful to children, including harms that disproportionately affect women and girls. Ofcom has already published its final codes and risk assessment guidance on illegal content and will shortly publish its final codes and guidance about protecting children.

Ofcom is also required to produce guidance setting out how providers can take action against harmful content and activity that disproportionately affects women and girls, in recognition of the unique risks they face. It has now published draft guidance which focuses on the following areas, with a safety by design approach:

  • Online misogyny: content that actively encourages or cements misogynistic ideas or behaviours, including through the normalisation of sexual violence.
  • Pile-ons and online harassment: when a woman or group of women is targeted with abuse and threats of violence. Women in public life, including journalists and politicians, are often affected (according to Ofcom figures, nearly 75% of female journalists have experienced online threats and abuse).
  • Online domestic abuse: using technology for coercive and controlling behaviour within an intimate relationship.
  • Intimate image abuse: the non-consensual sharing of intimate images, including those created with AI; as well as cyberflashing – sending explicit images to someone without their consent. 

The draft guidance identifies areas where technology firms should improve women and girls’ online safety, grouped under three categories: taking responsibility, designing their services to prevent harm, and supporting their users. They include:

  • “Abusability” testing to identify how a service or feature could be exploited by a malicious user;
  • Technology to prevent intimate image abuse, such as identifying and removing non-consensual intimate images by matching them against databases of known images;
  • User prompts asking people to reconsider before posting harmful material, including detected misogyny, nudity or content depicting illegal gendered abuse and violence;
  • Easier account controls, such as bundling default settings to make it easier for women experiencing pile-ons to protect their accounts;
  • Visibility settings, allowing users to delete or change the visibility of their content, including material they uploaded in the past;
  • Strengthening account security, for example by requiring additional authentication steps, making it harder for perpetrators to monitor accounts without the owner’s consent;
  • Removing geolocation by default, since leaked location data can lead to serious harms, including stalking and threats to life;
  • Training moderation teams to deal with online domestic abuse;
  • Reporting tools that are accessible and support users who experience harm;
  • User surveys to better understand people’s preferences and experiences of risk, and how best to support them; and
  • More transparency, including publishing information about the prevalence of different forms of harms, user reporting and outcomes.

The consultation ends on 23 May 2025.