UK: Public Accounts Committee issues report on preparedness for online safety regulation
The Public Accounts Committee has issued a report on Ofcom’s preparedness for online safety regulation. It says that Ofcom has made a good start in preparing for its new role as the online safety regulator. It benefited from time to prepare while the Online Safety Bill was going through Parliament and so has been able to move swiftly since the Act became law. Ofcom has already acted against a suicide-promoting website, which is now blocked in the UK.

However, the Committee says that it may be years before people notice a difference in their online experience. People may be further disappointed that Ofcom cannot act on individual complaints from the public and does not plan to inform complainants about any resulting action it takes where their complaints have helped it to identify a systemic issue with a service provider. Ofcom faces significant challenges in how it will engage with, supervise and regulate providers based overseas (which constitute the vast majority of regulated services), in particular smaller providers and those that may seek to avoid its attention. The Committee points out that over 100,000 service providers are covered by the Act, so Ofcom is reliant on large-scale data collection and automated systems, which it has yet to develop, to regulate them all. These systems will have to keep up with the fast-moving nature of online harms.

The Committee says that this regulatory regime is at the forefront of online regulation globally. If Ofcom follows through on its positive start, the establishment of the online safety regime has the potential to become an example of good practice in setting up a new regulator or significantly expanding an existing regulator’s remit. However, Ofcom still has a lot to do to implement an effective regulatory regime, and some of this work will take a long time.
A key measure of success for the new regime will be whether Ofcom is able to meet the requirement in the Act to have regulation in place for illegal harms and protecting children by April 2025. The government has two months to respond to the report.
UK: Ofcom consults on changes to digital television additional service licences
Ofcom is consulting on proposed changes to the conditions included in Digital Television Additional Service licences. These services are broadcast on Freeview and usually consist of text or data – for example, they are used to broadcast software that allows a viewer to watch channels delivered via the internet. Under the current licence conditions for these services, a warning must be displayed letting viewers know they are about to view material delivered over the internet, which may not be regulated in the same way as other television services. However, the current wording of the licence condition means that a warning must be displayed even if the service is licensed by Ofcom and therefore subject to its content standards rules. This could be confusing for viewers, so Ofcom is proposing to update the licence condition so that warnings are not required if the licensee holds an Ofcom broadcast licence. Ofcom also wishes to introduce some administrative changes. The consultation ends on 17 April 2024.
EU: European Commission opens formal DSA breach proceedings against TikTok
The European Commission has opened formal proceedings to assess whether TikTok may have breached the Digital Services Act (DSA) in areas linked to the protection of minors, advertising transparency, data access for researchers, and the risk management of addictive design and harmful content. Based on the preliminary investigation conducted so far, including an analysis of the risk assessment report sent by TikTok in September 2023 and TikTok’s replies to the Commission’s formal Requests for Information (on illegal content, protection of minors, and data access), the Commission has decided to open formal proceedings against TikTok under the DSA. The Commission will continue to gather evidence, for example by sending additional requests for information or conducting interviews or inspections. The opening of formal proceedings empowers the Commission to take further enforcement steps, such as interim measures and non-compliance decisions. The Commission is also empowered to accept any commitments made by TikTok to remedy the matters subject to the proceedings. The DSA does not set any legal deadline for bringing formal proceedings to an end. The duration of an in-depth investigation depends on several factors, including the complexity of the case, the extent to which the company concerned cooperates with the Commission, and the exercise of the rights of defence.
EU: Application by Bytedance seeking suspension of European Commission gatekeeper designation decision dismissed
In Bytedance v Commission (Case T-1077/23 R), the General Court dismissed an application by Bytedance for interim measures in its appeal against the European Commission’s September 2023 decision designating it a “gatekeeper” under Article 3 of the Digital Markets Act. According to the General Court, Bytedance had not shown that suspending the contested decision until the proceedings on the substance of the case are closed was necessary to avoid serious and irreparable harm to Bytedance. Bytedance had argued that the immediate implementation of the contested decision risked disclosure of highly strategic information concerning TikTok’s user profiling practices, which is not otherwise in the public domain, and that such disclosure would enable TikTok’s competitors and other third parties to gain insight into TikTok’s business strategies in a way that would significantly harm its business. The General Court rejected this argument.