UK law
Ofcom publishes letter to online service providers on generative AI and chatbots
Ofcom has published an open letter to online service providers in the UK about how the Online Safety Act 2023 applies to generative AI and chatbots. Websites or apps that use a generative AI chatbot enabling users to share text, images or videos generated by the chatbot with other users, as well as sites or apps that allow users to upload or create their own generative AI chatbots which are then made available to other users, are “user-to-user services” and so come within the scope of the Act. Any AI-generated text, audio, images or videos shared by users on a user-to-user service are considered “user-generated content” and are regulated in the same way as human-generated content. Generative AI tools that enable the search of more than one website or database also come under the Act, as they are search services. In addition, websites and apps that include generative AI tools capable of generating pornographic material are regulated under the OSA 2023. Ofcom also highlights the responsibilities of online service providers, including the requirements to undertake risk assessments about the likelihood of users encountering harmful content, to implement proportionate measures to mitigate and manage those risks, and to allow users to easily report illegal posts and material that is harmful to children.
Consultation on updating the media mergers regime
The UK’s communications regulator, Ofcom, has a duty under section 3 of the Communications Act 2003 to secure and maintain a sufficient plurality of providers of different TV and radio services. It also has a duty under section 391 of the Act to review the operation of the media ownership rules listed in that section every three years. In its most recent review of these rules, Ofcom set out three recommendations for government: retain the Cross-Media Ownership and Appointed News Provider rules relating to Channel 3; remove restrictions on certain entities holding broadcasting licences; and broaden the scope of the existing Media Public Interest Test framework beyond print newspapers and broadcasters by applying the existing public interest considerations in section 58 of the Enterprise Act 2002 to a broader range of news creators. It has now launched a technical consultation focusing on the final recommendation and is seeking views on the government’s proposed amendments to the EA 2002 that would implement these changes. The consultation ends on 18 December 2024.
DBT publishes study on prevalence and potential harm of defaults in online shopping
The DBT has published a report about online choice architecture. It sets out how consumer decisions are influenced by the design of digital shopping environments. Overall, it found that defaults are prevalent across e-commerce sites in the UK and can potentially have a large impact on consumers. However, the DBT does not think that they are, in general, misleading consumers, and it is therefore not calling for immediate policy intervention. Given the potential for misuse, the DBT instead recommends a forward-looking approach to policy that monitors the impact of defaults while guiding retailers towards transparent and consumer-friendly deployment. The DBT has developed three key policy recommendations. First, establishing guidelines, including best practice, for the use of defaults may help prevent the practice being used in a way that causes harm to consumers; such standards should account for the differential impact that defaults may have on more vulnerable groups. Second, it recommends consumer awareness and guidance: certain groups, such as older adults, may be more susceptible to the impact of defaults, and these and other vulnerable groups may need to be targeted through consumer education and awareness initiatives. Third, it recommends fostering consumer-centric innovation. The resources that online retailers invest in online choice architecture should focus on ensuring the practice benefits consumers, which could be achieved by collaborating with the government to ensure the adoption of practices that are fair and protect vulnerable users. For instance, data suggests that guiding retailers to make defaults more noticeable to consumers can reduce their impact, and such interventions can be low-cost, easy to implement and effective.
NCSC issues guidance for brands to help advertising partners counter malvertising
Digital advertising is fundamental to the digital economy and depends on the interactions between those selling advertising space and those buying it, often in real time. But this can be abused and result in malicious advertising, or malvertising, which can include malware. This can lead to fraud and undermines trust in the digital advertising industry.
However, by putting in place defence-in-depth measures, digital advertising partners can help reduce the presence of malvertising. These actions are transparent to the user and are consistent with the NCSC principles of secure by design, as each measure provides a layer of security which, when deployed collectively, reduces harm to the end user. The NCSC has set out the actions that brands should expect of their digital advertising partners, in the form of principles.
CMA publishes ninth update report on Google’s implementation of the “Privacy Sandbox” commitments
On 11 November 2024, the CMA published its latest views on the Privacy Sandbox tools. The report also incorporates the Monitoring Trustee’s assessment of Google’s compliance with the relevant provisions of the commitments. In addition, the CMA published Google’s progress report on its compliance with the binding commitments accepted by the CMA. The report covers the period from 1 April 2024 to 30 September 2024.
Tech execs to face sanctions for failing to curb knife content
To combat the unacceptable use of social media and online marketplaces to market illegal weapons and glorify violence, senior executives of social media companies will face personal fines if they fail to remove illegal content swiftly. The government has published a consultation including proposals to give police the power to issue notices to senior executives of online companies, ordering them to remove specific pieces of content, potentially within two days. If the company fails to act on this, the police will send a second notice to the senior executive in that company, who would then be personally liable for a significant fine if they too fail to act. The consultation ends on 11 December 2024.
CMA launches dynamic pricing project
The Competition and Markets Authority has opened a project to consider how dynamic pricing is being used across different sectors of the economy. The project will gather views from businesses in various industries, including travel and leisure, that are using pricing practices that may be considered dynamic pricing. It will also engage with consumer groups and other regulators to obtain their views. The project will consider different scenarios where dynamic pricing strategies are being used; the commercial and consumer benefits of dynamic pricing strategies; and whether dynamic pricing strategies create challenges for consumers and competition. The CMA will publish an update setting out the project’s findings. The CMA’s work will help to inform the UK government’s consideration of the issues raised by dynamic pricing, for example, as part of its upcoming call for evidence on price transparency in the live events sector. The project is not a formal investigation under the CMA’s competition or consumer enforcement powers, nor is it a formal market study or market investigation. It is being carried out under the CMA’s general review function under section 5 of the Enterprise Act 2002 and is separate from the CMA’s investigation into the sale of Oasis concert tickets by Ticketmaster.
CMA issues guidance on trader recommendation platforms
The CMA has also published compliance advice for trader recommendation platforms (TRPs). The purpose of the compliance advice is to help TRPs comply with their legal obligations. It sets out six principles that TRPs should follow, illustrated with practical examples. Increased compliance enhances consumer protection and maintains a fair and transparent trading environment among TRPs when presenting and supplying their services to consumers in the UK. Alongside this compliance advice, the CMA has published a ‘summary at a glance’: a snapshot of the six key principles TRPs should follow, together with some examples of ‘dos’ and ‘don’ts’, to be read alongside the compliance advice. The CMA has also published a response to its public consultation on the draft compliance advice, including details, where relevant, of the changes made to the draft as a consequence of comments received, as well as practical tips for consumers to help them when using TRPs to find a reliable trader.
EU law
First Draft of the General-Purpose AI Code of Practice published
The first draft of the General-Purpose AI Code of Practice has been published. The AI Office has also provided dedicated questions and answers to help stakeholders understand the relevant parts of the AI Act. The final document will play a crucial role in guiding the future development and deployment of trustworthy and safe general-purpose AI models. It will describe transparency and copyright-related rules for providers of general-purpose AI models. For the small number of providers of the most advanced general-purpose AI models that could pose systemic risks, the Code will also set out a taxonomy of systemic risks, risk assessment measures, and technical and governance mitigation measures. The drafting principles for the Code stress that measures, sub-measures and KPIs should be proportionate to the risks, consider the size of the general-purpose AI model provider, and allow simplified compliance options for SMEs and start-ups. In line with the AI Act, the Code will also reflect the notable exemptions for providers of open-source models. The principles also highlight the need for a balance between clear requirements and the flexibility to adapt as technology evolves.
European Commission launches consultation on AI Act prohibitions and AI system definition
The Commission’s Artificial Intelligence Office is consulting on future guidelines on the definition of an AI system and on the AI practices that pose unacceptable risks under the AI Act. The guidelines will help national competent authorities, as well as providers and deployers, to comply with the AI Act’s rules on such AI practices before they come into effect on 2 February 2025. The AI Office invites stakeholders, including AI system providers, businesses, national authorities, academia, research institutions and civil society, to submit their input. The contributions received will feed into the Commission’s guidelines on the definition of an AI system and on prohibited AI practices under the AI Act, to be published in early 2025. The legal concepts regarding the AI system definition and prohibited AI practices are established in the AI Act itself; the consultation seeks additional practical examples for the guidelines and aims to provide further clarity on practical aspects and use cases. The consultation ends on 11 December 2024.
European Commission announces that Booking.com must now comply with the Digital Markets Act
Booking Holdings Inc was designated as a gatekeeper on 13 May 2024 and must now ensure that its online intermediation service, Booking.com, complies with all relevant obligations of the Digital Markets Act. So-called “parity” clauses are prohibited by the DMA. Therefore, hotels, car rental companies and other service providers using Booking.com are now free to offer different (including better) prices and conditions on their own website or other channels than they do on Booking.com. Booking.com must not introduce other measures with the same effect as parity clauses. For example, Booking.com is not allowed to increase commission rates or de-list the offers of business users if they provide different prices on another website than they do on Booking.com. Hotels and other travel services will have real-time and continuous access to the data that they and their customers generate using Booking.com. Business users can also now choose to transfer the data they generated on Booking.com to alternative platforms. Booking.com is required to demonstrate its full and effective compliance with the DMA by outlining the measures undertaken in a compliance report. Additionally, Booking.com has submitted to the Commission an independently audited description of the techniques it uses for profiling consumers, along with a non-confidential version of the consumer profiling reports. Finally, the Commission requires Booking.com to keep any documents and information which might be relevant to assessing and monitoring effective implementation of, and compliance with, the DMA. The Commission will now analyse the compliance report and assess whether the implemented measures are effective in achieving the objectives of the relevant obligations under the DMA.