Advocate General Szpunar has issued an opinion in Case C-18/18 Eva Glawischnig-Piesczek v Facebook Ireland Limited that Facebook can be required by a court to seek and identify all comments identical to a defamatory comment that has been found to be illegal, as well as equivalent comments to the extent that they originate from the same user.
However, the EU law relied on in this case does not address whether Facebook can be ordered to delete the comments at issue worldwide.
Facts of the case
An Austrian politician (GP) applied to the Austrian courts for an injunction to be issued requiring Facebook to end the publication of a defamatory comment.
A Facebook user had shared an article from a news website on their personal page. A thumbnail was generated on Facebook containing the title, a brief summary of the article and a photograph of GP. The user also published a disparaging comment about GP alongside the article. This could be accessed by any Facebook user.
As Facebook did not respond to her request for that comment to be deleted, GP sought a court order requiring Facebook to cease publication and/or dissemination of photographs of her if the accompanying message disseminated the same allegations as the comment in question and/or ‘equivalent content’.
The first instance court granted the order, and Facebook disabled access in Austria to the content initially published. The case was ultimately brought before the Austrian Supreme Court, which considered that the statements at issue were intended to damage GP’s reputation and to insult and defame her. The Supreme Court was asked to rule on whether the injunction could also be extended, worldwide, to statements with identical wording and/or equivalent content of which Facebook was not aware. It therefore referred the case to the Court of Justice.
Relevant law
Under Directive 2000/31/EC (the E-Commerce Directive), a host provider (which includes an operator of a social network platform, such as Facebook) is, in principle, not liable for the information stored on its servers by third parties if it is not aware of the illegal nature of that information. However, once it is aware of the illegality, the host provider must delete that information or block access to it. The Directive also provides that a host provider cannot be placed under a general obligation to monitor the information which it stores, nor under a general obligation actively to seek facts or circumstances indicating illegal activity.
The AG’s opinion
The Advocate General considered that the Directive does not prevent a host provider which operates a social network platform from being ordered, in the context of an injunction, to seek and identify, among all the information disseminated by users of that platform, the information identical to that which has been characterised as illegal by the court that issued the injunction.
This ensures a fair balance between the fundamental rights involved, including the protection of private life and personality rights, the protection of freedom to conduct a business, and the protection of freedom of expression and information.
Firstly, seeking and identifying identical information does not require sophisticated techniques that might represent an extraordinary burden.
Secondly, in view of the ease with which information can be reproduced online, this approach is necessary to ensure the effective protection of privacy and personality rights.

In the context of the injunction, the host provider may also be ordered to seek and identify information equivalent to that characterised as illegal, but only among the information disseminated by the user who originally disseminated that illegal information. A court considering the removal of such equivalent information must ensure that the effects of its injunction are clear, precise and foreseeable. In doing so, it must weigh up the fundamental rights involved and take account of the principle of proportionality.
An obligation to identify equivalent information originating from any user would not ensure a fair balance between the fundamental rights concerned. On the one hand, seeking and identifying such information would require costly solutions. On the other hand, the implementation of those solutions would lead to censorship, so that freedom of expression and information might well be systematically restricted.
In addition, as the Directive does not regulate the territorial scope of an obligation to remove information disseminated via a social network platform, it does not prevent a host provider from being required to remove such information worldwide. Nor is the territorial scope regulated by other provisions of EU law: GP was not relying on EU law but on the general provisions of Austrian civil law relating to breach of privacy and of personality rights, including defamation, which have not been harmonised. Both the extraterritorial effects of an injunction imposing a removal obligation and the territorial scope of such an obligation should therefore be analysed by reference, in particular, to public and private international law.
Finally, the Advocate General considered that the Directive does not prevent a host provider from being required to remove information equivalent to the information characterised as illegal, where it has been made aware of that equivalent information by the person concerned, by third parties or by another source, since in that case the removal obligation does not entail general monitoring of the information stored.