The regulation (or, at least, attempted regulation) of online pornography, and initiatives under the banner of protecting children online, are nothing new. Over recent years, we have seen legislative interventions (including the regulation of ‘extreme pornography’ under the Criminal Justice and Immigration Act 2008 and the Audiovisual Media Services Regulations 2014), co-regulatory approaches such as the Internet Watch Foundation’s ‘CAIC’ list, and self-regulatory measures (albeit underpinned in some cases by not-inconsiderable political pressure) by Internet access providers to implement ‘family friendly’ filters.
Part 3 of the Digital Economy Bill 2016 is, as currently drafted, a further attempt in this area, setting out a framework to require operators of commercial online pornography services to implement age verification.
The general obligation
The main thrust, if I can put it that way, of this part of the Bill is that a person who ‘makes pornographic material available on the internet on a commercial basis to persons in the United Kingdom’ is prohibited from doing so ‘except in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18.’ In other words, the service must have at least reasonably robust age verification mechanisms, or else not provide content to UK users.
The framework is to cover both websites and ‘other means of accessing the internet’ — a slightly nebulous term, which is likely to mean ‘apps’, given that applications are expressly identified in the government’s response to its February 2016 consultation on age verification.
Internet access providers are likely to feel left in an uncertain position as, while the Bill does not reference them in this context, the definition of ‘makes pornographic material available’ could be argued to encompass companies which provide connectivity to the servers used for making pornographic material available. They may take some solace from paragraph 22 of the Explanatory Notes, which refers to ‘commercial providers of pornography’ and so appears to place the emphasis on content provision, but the better approach would be to improve the drafting of this section to put the legislative intent beyond doubt.
‘Operating on a commercial basis’
There is no definition within the Bill of ‘on a commercial basis’, although the provision of pornography ‘free of charge’ on a service which is otherwise ‘operated on a commercial basis’ is expressly stated to fall within the rules (clause 15(2)).
This approach would appear to be consistent with that taken by the CJEU in considering the scope of the ‘hosting’ shield for the purposes of Directive 2000/31/EC, in respect of which it has held that a site’s economic context need not relate solely to revenue flowing from the site’s visitors, but can also include revenue from a site’s advertising deals (Papasavvas). Similarly, in the recent decision of McFadden, the CJEU held that making available a free wi-fi service was a ‘service normally provided for remuneration’ where it is an activity performed by the service provider for the purposes of advertising the goods sold or services supplied by that service provider: the Court looked broadly at the purpose of the wi-fi service provision, rather than narrowly at the specific service itself.
A site which provides pornography free of charge and without any third-party advertising revenue may, on such a basis, still be considered to be ‘operating on a commercial basis’ if it is operated with the intention of ‘upgrading’ users to the operator’s premium, paid-for content.
Further, a broad degree of discretion is to be afforded to the regulator, which will seemingly be empowered to decide when a site is, or is not, complying with the rules, and whether or not it is operating on a commercial basis (clause 15(3)). While affording a regulator these powers may allow for a nimbleness of movement which a statute may struggle to provide, it also entails giving an appointed body potentially considerable power over a business’s ability to operate.
Definition of ‘pornographic material’
The definition of ‘pornographic material’ within the Bill takes up the best part of half a page, considerably more than the definition used in the Criminal Justice and Immigration Act 2008, which takes a little over a line.
The gist of the definition is that ‘pornographic material’ encompasses all video works, or excerpts of video works, which are, or would be, ‘R18’ material, or would have led to an 18 certificate (clause 16).
At first glance, there would appear to be a lacuna here, as the definition would seem to exclude material which would not be eligible for an R18 certificate but which would also not fall within the offences of obscenity or extreme pornography; this may include, for example, a portrayal of apparently non-consensual activity. As this is likely to be exactly the type of material in respect of which measures of this nature are considered necessary, the omission would appear to be an unfortunate oversight.
Age verification and proof of identity
The Bill does not specify what would constitute suitable age-verification measures, but the government’s consultation response is clear that a ‘tick-box’ or ‘enter your date of birth’ verification will not be deemed sufficient.
As such, some form of third-party validation or data sharing is likely to be compulsory, and it will be interesting to see whether this can be achieved without requiring users to identify themselves, by providing their real-world identity, to site operators.
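By way of illustration only, the following minimal sketch (in Python, with entirely hypothetical names and a deliberately simplified shared-key scheme) shows one way a third-party verifier could attest that a visitor is over 18 without the site ever learning the visitor’s real-world identity; neither the Bill nor the consultation response prescribes any such mechanism.

```python
# Entirely hypothetical sketch: a third-party verifier attests 'over 18'
# against an anonymous session identifier, and the site checks that
# attestation without ever learning who the visitor is. The verifier,
# the token format and the shared-key scheme are illustrative assumptions;
# in practice public-key signatures would be more likely.
import hashlib
import hmac

VERIFIER_KEY = b"example-shared-secret"  # hypothetical key shared by verifier and site


def issue_attestation(session_id: str) -> str:
    """Verifier side: having checked the user's age out of band, sign an
    'over 18' assertion bound only to an anonymous session identifier."""
    message = f"over18:{session_id}".encode()
    return hmac.new(VERIFIER_KEY, message, hashlib.sha256).hexdigest()


def site_accepts(session_id: str, attestation: str) -> bool:
    """Site side: verify the attestation. The site learns only that this
    session has been verified as over 18, not the visitor's identity."""
    expected = hmac.new(VERIFIER_KEY, f"over18:{session_id}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation)


if __name__ == "__main__":
    token = issue_attestation("anonymous-session-42")
    print(site_accepts("anonymous-session-42", token))     # True
    print(site_accepts("anonymous-session-42", "forged"))  # False
```

Whether any scheme of this kind would satisfy the regulator is, of course, unknown at this stage; the point is simply that third-party validation and anonymity towards the site operator are not necessarily incompatible.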
If the only way in which a site could implement appropriate age verification were to require proof of identity, this is likely to have a significant adverse impact on most sites’ business models, as I would be surprised if many visitors were willing to provide this information. I suspect that many users value their apparent (if not actual) anonymity when visiting a pornography site. If this is indeed the effect, and if the requirement is not imposed and enforced across all providers, domestic and foreign, UK-based providers are likely to be at a considerable disadvantage.
Overseen by a new regulator
To oversee this new framework, the government is proposing to create a new regulator, the ‘age-verification regulator’ (clause 17). This new regulator would be imbued with information-gathering powers (clause 19), as well as enforcement powers (clauses 20 and 21).
Enforcement powers include both enforcement notices and financial sanctions, although an enforcement notice appears to be entirely optional and is not a prerequisite of issuing a financial sanction (clause 20(4)(a)). Financial sanctions carry a maximum penalty of the greater of £250,000 or 5% of qualifying turnover (clause 21).
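Assuming, purely for illustration, that ‘qualifying turnover’ resolves to a single figure in pounds (the manner of its determination is a matter for the Bill and the regulator), the penalty cap works as follows:

```python
def maximum_penalty(qualifying_turnover_gbp: float) -> float:
    """Illustrative only: the maximum financial penalty under clause 21 is
    the greater of £250,000 and 5% of qualifying turnover."""
    return max(250_000, 0.05 * qualifying_turnover_gbp)


# A provider with £10m qualifying turnover faces a cap of £500,000;
# one with £1m turnover faces the £250,000 floor instead.
assert maximum_penalty(10_000_000) == 500_000
assert maximum_penalty(1_000_000) == 250_000
```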
Enforcement is undertaken on a civil, rather than criminal, basis. Although my gut reaction is that this is disappointing, since it entails a lesser burden of proof and affords a substantial degree of power to a non-judicial body, it is perhaps to the general benefit of those against which sanctions might be levied that they face a merely financial, rather than also a criminal, risk.
Payment providers, ISPs and others
There is no power within the Bill to require an Internet access provider to block access to a service which fails to comply with the age-verification obligations. According to the Explanatory Notes, at paragraph 23, such a power is not required ‘on the basis that this would not be consistent with the treatment of other harmful or illegal content such as online terrorist material’. The Notes also appear to cite with approval existing co-regulatory approaches, such as the deployment by major ISPs of ‘optional family friendly filters’.
Net neutrality and the blocking of porn sites
The government’s seeming approval of the voluntary ‘family friendly filtering’ approach adopted by many Internet access providers is interesting, particularly against the backdrop of the EU’s open Internet rules in Regulation (EU) 2015/2120.
Overseas effect and general effectiveness?
The elephant in the room with this legislation — and any legislation of a similar character — is the extent to which it will apply to providers based outside the UK and, more broadly, whether it will have much practical effect, particularly without the fall-back mechanism of compulsory site blocking.