The Report reaches conclusions and recommendations on:
• risks from the Internet, particularly social networking sites and sites hosting user-generated content;
• ways of protecting Internet users from potentially harmful content;
• safe use of social networking sites; and
• classification of video games.
Laurence Eastham writes:
The report summary is reproduced below and, although its opening paragraph suggests the sort of approach for which politicians are notorious (ie over-reacting to media hype), its message is a sensible one. There is an essential illogicality arising from the fact that there is a potential liability where user-generated content is monitored but freedom from liability where it is not. Lawyers tend to shrug their shoulders and accept that illogicality, but it is not too surprising that MPs want to change it.
It is hard, too, to argue with the view that increasing awareness of the privacy and safety implications of posting information online, especially on social networking sites, is a Good Thing.
This is an area where imposed regulation and/or legislation may be sought, notwithstanding the stated preference for self-regulation and the dependence on change at an EU level. The political difficulty of arguing with the conclusions of the Byron report (on the basis that ‘Nanny knows best’, and for fear of being thought not to care about child welfare) means that any proposal for regulation is hard to resist. That Brussels factor leads me to wonder whether reports such as this might not be more valuable if there were at least some input from Euro-MPs, even if actual membership of the Committee is a constitutional impossibility.
For the Government consultation on games classification, click here.
Report Summary
More and more, the Internet is becoming a part of our lives. For communications, research and commerce, it is now an indispensable tool. However, anyone who regularly watches television or reads the press is likely to have become aware of growing public concern in recent months at the Internet’s dark side, where hardcore pornography and videos of fights, bullying or alleged rape can be found, as can websites promoting extreme diets, self-harm, and even suicide.
There is particular anxiety about the use of social networking sites and chatrooms for grooming and sexual predation. Although these environments may appear to a child to be relatively private, with defined boundaries, in fact a user’s profile or an online forum may be open to thousands or even millions of other users, all able to view even if they do not actively participate. Unless users of a networking site take steps to control access to their territory, they are potentially exposed to malicious communication or comment and to invasions of privacy.
There is heated debate about whether certain types of content cause harm, particularly to children, and whether there is a direct link between exposure to violent content on the Internet or in video games and subsequent violent behaviour. The conclusion overall is that there is still no clear evidence of a causal link; but incontrovertible evidence of harm is not necessarily required in order to justify a restriction of access to certain types of content in any medium, and we conclude that any approach to the protection of children from online dangers should be based on the probability of risk.
We welcome the analysis by Dr Byron of the risks posed by the Internet to children and agree with her conclusion that a UK Council for Child Internet Safety should be established. We are concerned at reports from some key players that there has been little opportunity to influence decisions as to how the Council will operate in practice.
Sites which host user-generated content—typically photos and videos uploaded by members of the public—have taken some steps to set minimum standards for that content. They could and should do more. We recommend that terms and conditions which guide consumers on the types of content which are acceptable on a site should be prominent. It should be made harder for users to avoid seeing and reading the conditions of use; it would then be more difficult for them to claim ignorance of the terms and conditions if they upload inappropriate content.
It is not standard practice for staff employed by social networking sites or video-sharing sites to preview content before it can be viewed by consumers. Some firms do not even undertake routine review of material uploaded, claiming that the volumes involved make it impractical. We were not persuaded by this argument, and we recommend that proactive review of content should be standard practice for sites hosting user-generated content. We look to the proposed UK Council to give a high priority to reconciling the conflicting claims about the practicality and effectiveness of using staff and technological tools to screen and take down material. We also invite the Council to help develop agreed standards across the Internet industry on take-down times—to be widely publicised—in order to increase consumer confidence.
It is common for social networking sites and sites hosting user-generated content to provide facilities for reporting abuse or unwelcome behaviour, but few provide a direct reporting facility to law enforcement agencies. We believe that high-profile facilities with simple, preferably one-click, mechanisms for reporting directly to law enforcement and support organisations are an essential feature of a safe networking site. We would expect providers of all Internet services based upon user participation to move towards these standards without delay.
The designation of some of the more extreme types of material as illegal has had a beneficial effect in restricting the amount of harmful content hosted in the UK and in limiting access to harmful content hosted abroad. We do, however, believe that the UK Council for Child Internet Safety should discuss with the Ministry of Justice whether the law on assisted suicide is clear enough to enable not just successful prosecutions but also action to block access to websites which assist or encourage suicide.
As things stand, companies in the Internet industry largely regulate themselves. We believe that self-regulation has a range of strengths: a self-regulating industry is better placed to respond quickly to new services; it is more likely to secure “buy-in” to principles; and it will bear the immediate cost. We accept that significant progress has been achieved through self-regulation by the various industries offering Internet-based services, but there appears to be a lack of consistency and transparency of practice, and the public needs the assurance that certain basic standards will be met. Rather than leap to statutory regulation, we propose a tighter form of self-regulation, under which the industry would speedily establish a self-regulatory body to draw up agreed minimum standards based upon the recommendations of the UK Council for Child Internet Safety, monitor their effectiveness, publish performance statistics, and adjudicate on complaints. In time, the new body might also take on the task of setting rules governing practice in other areas such as online piracy, peer-to-peer file-sharing, and targeted or so-called “behavioural” advertising.
Several Government departments have an interest in this field, and it does seem that there is scope for improved co-ordination of activity between them. A single Minister should have responsibility for co-ordinating the Government’s effort in improving levels of protection from harm from the Internet, overseeing complementary initiatives led by different Government departments, and monitoring the resourcing of relevant Government-funded bodies.
There is a distinct issue about labelling of video games to indicate the nature of their content. Two systems currently exist side by side: the industry awards its own ratings, and the British Board of Film Classification awards classifications to a small number of games which feature content unsuitable for children. The dual system is confusing, and Dr Byron recommended that there should instead be a single hybrid system. We believe that Dr Byron’s solution may not command confidence in the games industry and would not provide significantly greater clarity for consumers. While either of the systems operated by the BBFC and by the industry would be workable in principle, we believe that the widespread recognition of the BBFC’s classification categories and their statutory backing offer significant advantages which the industry’s system lacks. We therefore agree that the BBFC should have responsibility for rating games with content appropriate for adults or teenagers, as proposed by Dr Byron, and that these ratings should appear prominently. Distributors would of course be free to continue to use industry ratings in addition.