The House of Lords Communications and Digital Committee has issued a report on online freedom of speech. It considers how competition between platforms can be increased to benefit freedom of expression; how the government should ensure that illegal content is removed and that legitimate opinions are not censored; and how the design of platforms and initiatives to promote digital citizenship can support inclusive and civil online environments for the exchange of information, ideas and opinions. The Committee has made recommendations to the government on how it should regulate platforms and to the industry on how it can change.
The Committee highlights that freedom of speech is a hallmark of free societies, but it is not an unfettered right. The internet, and particularly social media, provides individuals with an unprecedented ability to share their views. The Committee says that it welcomes this and seeks to strengthen freedom of expression online. However, it says that the sector has been monopolised by a small number of private companies, which are often based outside the UK and whose primary aim is to profit from their users’ data. They are free to ban or censor whoever and whatever they wish, as well as to design their platforms to encourage and amplify certain types of content over others.
In recent years, the harms users can suffer online have received growing attention. The Committee supports the government’s proposal that, through the proposed draft Online Safety Bill, platforms should be obliged to remove illegal content. Ofcom should hold them to strict timeframes where content is clearly illegal. The Committee says that it supports the government’s intention to protect children from harm, although it says that the draft Bill is inadequate in this respect, particularly in relation to pornographic websites. Nor is the Committee convinced that the draft Bill sufficiently protects vulnerable adults. It says that the Bill’s proposed duties should be complemented by an increase in resources for the police to allow them to enforce the law effectively, including on harassment, death threats, incitement, stirring up hatred, and extreme pornography. The Committee says that platforms should contribute to this increase in resources.
The Committee disagrees with the government’s proposal to introduce duties on platforms in relation to content which is legal but may be harmful to adults. It says that if the government believes that a type of content is sufficiently harmful, it should be criminalised. It gives the example of the racist abuse directed at members of the England men’s football team and says that any such abuse which is not already illegal should be made so.
It goes on to say that content which is legal but some may find objectionable should instead be addressed through regulation of the design of platforms, digital citizenship education, and competition regulation. According to the Committee, this approach would be more effective, as well as better protecting freedom of expression.
Furthermore, the Committee points out that social media platforms do not simply provide users with a neutral means of communicating with one another. The way they are designed shapes what users see, what they say, and how they interact. Platforms’ business models drive them to design services to maximise the time users spend on them, even if this means encouraging their worst instincts. This includes providing incentives to post outrageous content, the reach of which is then amplified by platforms’ algorithms. The Committee calls for the Online Safety Bill to include a robust duty to ensure that powerful platforms make responsible design choices and put users in control over what content they are shown by giving them an accessible and easy-to-use toolkit of settings, including through third-party applications.
The Committee adds that design changes must be complemented by digital citizenship education, both through schools and public information campaigns, to make clear the moral duties to others which come with the right to express oneself freely online and the consequences which the abuse of that right can have, as well as the dangers of retreating into social media echo chambers. Ultimately, says the Committee, people must be given a real choice about the platforms they use.
Finally, the Committee says that the dominance of companies such as Facebook and Google means that they have little incentive to put their users’ interests first, as users have few realistic alternatives. They have become like utilities. Tougher competition regulation is long overdue. The government must urgently give the Digital Markets Unit the powers it needs to “end the stranglehold of the big technology companies”. The rights and preferences of individual people must be at the heart of a new, joined-up regulatory approach, bringing together competition policy, data, design, law enforcement, and the protection of children.