The European Parliament has agreed its position on the draft Digital Services Act, which contains measures to tackle illegal content, to ensure platforms are held accountable for their algorithms, and to improve content moderation. The text will serve as Parliament's mandate for negotiations with the French presidency of the Council, which represents member states.
Removing illegal content and preventing the spread of disinformation
The proposed Digital Services Act defines clear responsibilities and accountability for providers of intermediary services, and in particular online platforms, such as social media and marketplaces.
It also establishes a “notice and action” mechanism, as well as provisions about the removal of illegal products, services or content online. The proposal says that providers of hosting services should act on receipt of such a notice “without undue delay, taking into account the type of illegal content that is being notified and the urgency of taking action”. The text also includes stronger safeguards to ensure notices are processed in a non-arbitrary and non-discriminatory manner and with respect for fundamental rights, including the freedom of expression.
The text also requires online marketplaces to ensure that consumers can purchase safe products online, strengthening the obligation to trace traders (the “Know Your Business Customer” principle).
Additional obligations for very large platforms
Very large online platforms (VLOPs) will be subject to specific obligations due to the particular risks they pose regarding the dissemination of both illegal and harmful content. The DSA would help to tackle harmful content (which might not be illegal) and the spread of disinformation by including provisions on mandatory risk assessments, risk mitigation measures, independent audits and the transparency of so-called “recommender systems” (algorithms that determine what users see).
Other key points
Parliament introduced several changes to the European Commission's proposal, including on:
- exempting micro and small enterprises from certain DSA obligations;
- targeted advertising: the text provides for more transparent and informed choice for the recipients of digital services, including information on how their data will be monetised. Refusing consent should be no more difficult or time-consuming for the recipient than giving consent. If their consent is refused or withdrawn, recipients shall be given other options to access the online platform, including “options based on tracking-free advertising”;
- a ban on targeting or amplification techniques that involve the data of minors for the purpose of displaying ads, as well as on targeting individuals on the basis of special categories of data which allow for targeting vulnerable groups;
- compensation: recipients of digital services and organisations representing them must be able to seek redress for any damages resulting from platforms not respecting their due diligence obligations;
- online platforms should be prohibited from using deceptive or nudging techniques to influence users’ behaviour through “dark patterns”;
- more choice on algorithm-based ranking: VLOPs should provide at least one recommender system that is not based on profiling.
Further amendments approved in plenary relate to the need for providers to respect in their terms and conditions the freedom of expression and freedom and pluralism of the media, as well as a new provision on the right to use and pay for digital services anonymously.
The EU consumer group BEUC has issued its opinion on the Parliament’s position, saying that “it fails to hit the heights needed to transform the situation fully to address consumers’ needs despite some improvements. The Parliament should have supported a full ban on surveillance ads. These ads exacerbate an epidemic of problems online, such as disinformation and manipulation of consumers’ choices. Banning ads that track minors is good, but the ban should have been extended to all consumers.”