The House of Commons Women and Equalities Select Committee has issued a report recommending that possession of non-consensual intimate images (NCII) be made a criminal offence, putting it on the same legal footing as child sexual abuse material (CSAM).
Non-consensual intimate image (NCII) abuse occurs when intimate content is produced, published, or reproduced without consent, often online.
NCII abuse can also include material that is considered “culturally intimate” for the victim, such as a Muslim woman being pictured without her hijab. The Committee says that the government should expand the legal definition to include such images.
The Online Safety Act 2023 (OSA) creates criminal offences for individuals relating to NCII and places duties on regulated search services and user-to-user services (e.g. social media), including a requirement to take down NCII content. Ofcom also has powers to enforce providers’ compliance with the Act, such as imposing fines and applying for service restriction orders. While many platforms remove NCII content voluntarily, some fail to comply with requests to take material down. Around 10% of content remains online, invariably hosted on sites based overseas. The Committee believes that the new regulatory regime overseen by Ofcom is unlikely to have much impact on such sites, because Ofcom’s current enforcement powers are too slow and not designed to help individuals get NCII taken down from non-compliant websites. In such circumstances, access to those sites should be blocked. For internet infrastructure providers to take this threat seriously and block access to websites that refuse to comply, NCII should be brought in line with CSAM in law.
The government should bring forward amendments to the Crime and Policing Bill to make possession of NCII an offence. It should also create voluntary guidance for internet infrastructure providers on tackling NCII, as it has for CSAM.
The government should also take a holistic approach to legislating against NCII abuse by introducing a swift and inexpensive statutory civil process, as has been established in other jurisdictions. In addition, there should be a registry of NCII content that internet infrastructure providers are requested to block access to, similar to the current arrangements for CSAM. The statutory regime should enable civil courts to make orders, including designating an image as NCII content and ordering its inclusion on the registry, as well as requiring an individual to delete any such images.
The government should also set up an Online Safety Commission, modelled on Australia’s eSafety Commissioner, with a focus on support for individuals. The new Commission would be able to apply for and serve such court orders and oversee the NCII registry.
There have been cases where, following the criminal justice process, perpetrators have had devices containing the NCII returned to them, which is harrowing for victims. The Committee says that the government, Sentencing Council and Crown Prosecution Service must each take steps to ensure that those charged with NCII offences are deprived of that material.
The Revenge Porn Helpline has launched a free ‘hashing’ tool designed to protect people from NCII abuse. Hashing generates a digital fingerprint that uniquely identifies an image or video; this fingerprint is distributed to participating platforms so they can prevent that content from being uploaded. The Committee expresses disappointment that some major platforms have so far not joined the 13 platforms currently participating and says they should do so urgently. The Committee welcomes Ofcom’s plans to consult on expansions to its Codes of Practice that would include proposals on the use of hashing.
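To illustrate the mechanism only: the report does not describe the tool’s internals, and the file names and hash choice in the sketch below are assumptions, not details of the Helpline’s system. A hash function turns an image into a short fingerprint that can be shared and matched without sharing the image itself; this sketch uses Python’s standard-library SHA-256, whereas real NCII-matching systems typically use perceptual hashes so that re-encoded or lightly edited copies still match.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return a hex digest that identifies the file's contents.

    SHA-256 is used here as an illustrative stand-in; it only matches
    byte-identical copies, unlike the perceptual hashes production tools
    tend to use.
    """
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# A platform keeps a set of known NCII fingerprints (shared hashes, not
# the images themselves) and blocks any upload whose fingerprint matches.
# "reported_image.jpg" and "new_upload.jpg" are hypothetical paths.
blocklist: set[str] = {fingerprint(Path("reported_image.jpg"))}

def should_block(upload: Path) -> bool:
    return fingerprint(upload) in blocklist

print(should_block(Path("new_upload.jpg")))
```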
Synthetic NCII, also known as ‘deepfakes’, refers to any sexual or nude media created using AI that depicts the likeness of another individual without their consent. The government’s plans to criminalise their creation are welcome, but the Committee says the offence must be based on the victim’s lack of consent, not the perpetrator’s motivation. The creation and use of nudification apps should also be criminalised.
The OSA represents considerable progress in this area, as do the additional offences included in the Crime and Policing Bill and Data (Use and Access) Bill, but significant gaps in the legislative and regulatory framework remain.